iOS Development: Recording Audio with AVAudioRecorder and Playing It Back with AVAudioPlayer

In an earlier project I needed to implement audio recording and playback. That work used AVAudioSession, AVAudioRecorder, and AVAudioPlayer; this post documents the implementation along with a record-and-play example.

1. What is AVAudioSession?

AVAudioSession is the API Apple provides for managing an app's use of the audio hardware (I/O).
The audio session configuration affects all of the app's audio activity. You can also query the audio session for the device's hardware characteristics, such as the channel count, the sample rate, and the availability of audio input.
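
For example, these values can be inspected at runtime through the shared session (a minimal sketch using standard AVAudioSession properties, not code from the demo project):

AVAudioSession *session = [AVAudioSession sharedInstance];
// Hardware characteristics exposed by the session
NSLog(@"sample rate: %.0f Hz", session.sampleRate);
NSLog(@"input channels: %ld", (long)session.inputNumberOfChannels);
NSLog(@"output channels: %ld", (long)session.outputNumberOfChannels);
NSLog(@"input available: %@", session.inputAvailable ? @"YES" : @"NO");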

The commonly used audio session categories:

The default category is AVAudioSessionCategorySoloAmbient (solo playback).

  • AVAudioSessionCategoryAmbient: playback only, and it mixes with other audio such as QQ Music. A typical use is game background sound while the user keeps listening to their own music. Audio in this category is silenced by the ring/silent switch and when the screen locks, which makes it suitable for background sound in most apps.

  • AVAudioSessionCategorySoloAmbient: also playback only, but unlike AVAudioSessionCategoryAmbient it does not mix with other audio, so QQ Music stops. This suits apps that should not be disturbed by other audio, such as the rhythm game 节奏大师 (Rhythm Master). It is likewise silenced by the silent switch and by locking the screen, so a locked screen means no more rhythm game.

  • AVAudioSessionCategoryPlayback: use this when audio should keep playing after the screen locks. While the app plays, other apps such as QQ Music are interrupted, so this category is the usual choice for player apps.

  • AVAudioSessionCategoryRecord: recording only, for example recording a WeChat voice message. Since a quiet recording is wanted, other playing audio such as QQ Music is interrupted. Think of the WeChat voice-message scenario and you know when to use it.

  • AVAudioSessionCategoryPlayAndRecord: for playing and recording at the same time, for example VoIP and phone-call style scenarios; PlayAndRecord was designed for exactly that.

  • AVAudioSessionCategoryMultiRoute: imagine a DJ app where the phone plays the current track over HDMI to the speakers while the next track is cued in the headphones. This category supports multiple simultaneous input and output routes for that kind of setup.

  • AVAudioSessionCategoryAudioProcessing: mainly for offline audio format processing, typically used together with Audio Units (note that this category was deprecated in later iOS versions).

Different scenarios call for different session categories, which are configured by calling setCategory: on the shared AVAudioSession instance:

- (void)registerSesstion {
    _session = [AVAudioSession sharedInstance];
    NSError *sessionError;
    [_session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    
    if (sessionError) {
        NSLog(@"Error setting audio session category: %@", [sessionError description]);
    } else {
        [_session setActive:YES error:nil];
    }
}
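
When recording or playback ends, it is usually a good idea to deactivate the session so that audio from other apps can resume. A minimal sketch (the NotifyOthersOnDeactivation option is a standard AVAudioSession flag; whether and where to call this depends on the app):

// Deactivate the session and let other apps resume their audio
NSError *deactivateError = nil;
[[AVAudioSession sharedInstance] setActive:NO
                               withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                     error:&deactivateError];
if (deactivateError) {
    NSLog(@"Error deactivating session: %@", deactivateError);
}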

Recording itself is handled by AVAudioRecorder, which provides audio recording functionality inside an application.

2. Recording with AVAudioRecorder

To implement recording, first decide on the settings AVAudioRecorder needs: the audio format, sample rate, number of channels, encoder quality, and so on.

NSMutableDictionary *settings           = [[NSMutableDictionary alloc] init];
    // Audio format (linear PCM, suitable for the .wav output file)
    [settings setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    // Sample rate
    [settings setValue:[NSNumber numberWithFloat:11025.0] forKey:AVSampleRateKey]; // or 44100.0
    // Encoder bit rate
    [settings setValue:[NSNumber numberWithFloat:38400.0] forKey:AVEncoderBitRateKey];
    // Number of channels
    [settings setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    // Encoder audio quality
    [settings setValue:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];
    _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&recorderSetupError];
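
If a compressed file is preferred over linear PCM in a .wav container, the same kind of dictionary can describe an AAC recording instead. A hedged variant (the .m4a path and the concrete values below are illustrative only, not taken from the project):

// AAC into an .m4a container instead of linear PCM into .wav
NSDictionary *aacSettings = @{
    AVFormatIDKey            : @(kAudioFormatMPEG4AAC),
    AVSampleRateKey          : @(44100.0),
    AVNumberOfChannelsKey    : @(1),
    AVEncoderAudioQualityKey : @(AVAudioQualityMedium)
};
// 'basePath' stands for any writable file path without an extension (hypothetical)
NSURL *aacURL = [NSURL fileURLWithPath:[basePath stringByAppendingPathExtension:@"m4a"]];
AVAudioRecorder *aacRecorder = [[AVAudioRecorder alloc] initWithURL:aacURL settings:aacSettings error:nil];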

The AVAudioRecorder delegate methods:

#pragma mark - AVAudioRecorderDelegate
/* audioRecorderDidFinishRecording:successfully: is called when a recording has been finished or stopped. This method is NOT called if the recorder is stopped due to an interruption. */
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag {
    // Called when recording finishes or is stopped
    [self resetRecorder];
}

/* if an error occurs while encoding it will be reported to the delegate. */
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder error:(NSError * __nullable)error {
    // Called when an encoding error occurs during recording
    [self resetRecorder];
}
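
As the header comment above notes, audioRecorderDidFinishRecording:successfully: is not delivered when recording is stopped by an interruption (an incoming call, for instance). If that case matters, the session interruption notification can be observed as well; a minimal sketch (observeInterruptions and handleInterruption: are hypothetical names, the notification and keys are standard AVFoundation):

// Register for audio session interruptions (phone calls, alarms, ...)
- (void)observeInterruptions {
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(handleInterruption:)
                                                 name:AVAudioSessionInterruptionNotification
                                               object:nil];
}

- (void)handleInterruption:(NSNotification *)notification {
    AVAudioSessionInterruptionType type =
        [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue];
    if (type == AVAudioSessionInterruptionTypeBegan) {
        // The system interrupted recording; stop the recorder and reset state here
    }
}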

Once recording finishes, the audio is saved to a local file.

The complete recording code is as follows:

#import "SDAudioRecordManager.h"
#import <AVFoundation/AVFoundation.h>

#define kMaxRecordTime  60

static SDAudioRecordManager *manager = nil;

/**
 Audio recording manager
 */
@interface SDAudioRecordManager () <AVAudioRecorderDelegate>

@property (nonatomic, strong) AVAudioSession  *session;

@property (nonatomic, strong) AVAudioRecorder *recorder;

@property (nonatomic, strong) NSString *audioFileName;

@property (nonatomic, strong) NSString *audioPath;

@property (nonatomic, strong) NSTimer *timer;

@property (nonatomic, assign) float curCount;

@end

@implementation SDAudioRecordManager

+ (instancetype)shareInstance {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        manager = [[SDAudioRecordManager alloc] init];
    });
    return manager;
}

- (void)registerRecordManager {
    _audioFileName = [self getCurrentTimeString];
    _audioPath = [self getWavPathByFileName:_audioFileName];
    _recorder = nil;
    NSError             *recorderSetupError = nil;
    NSURL               *url                = [NSURL fileURLWithPath:_audioPath];
    NSMutableDictionary *settings           = [[NSMutableDictionary alloc] init];
    // Audio format (linear PCM, suitable for the .wav output file)
    [settings setValue:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    // Sample rate
    [settings setValue:[NSNumber numberWithFloat:11025.0] forKey:AVSampleRateKey]; // or 44100.0
    // Encoder bit rate
    [settings setValue:[NSNumber numberWithFloat:38400.0] forKey:AVEncoderBitRateKey];
    // Number of channels
    [settings setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];
    // Encoder audio quality
    [settings setValue:[NSNumber numberWithInt:AVAudioQualityMin] forKey:AVEncoderAudioQualityKey];
    _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&recorderSetupError];
    if (recorderSetupError) {
        NSLog(@"%@", recorderSetupError);
    }
    _recorder.meteringEnabled = YES;
    _recorder.delegate = self;
    [_recorder prepareToRecord];
}

- (void)registerSesstion {
    _session = [AVAudioSession sharedInstance];
    NSError *sessionError;
    [_session setCategory:AVAudioSessionCategoryPlayAndRecord error:&sessionError];
    
    if (sessionError) {
        NSLog(@"Error setting audio session category: %@", [sessionError description]);
    } else {
        [_session setActive:YES error:nil];
    }
}

- (BOOL)checkCanRecord {
    __block BOOL bCanRecord = YES;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    AVAudioSessionRecordPermission avasrp = [audioSession recordPermission];
    if (avasrp == AVAudioSessionRecordPermissionGranted) {
        // Permission granted
        bCanRecord = YES;
    } else if (avasrp == AVAudioSessionRecordPermissionUndetermined) {
        // Permission not determined yet; the system prompt appears on first record
        bCanRecord = YES;
    } else if (avasrp == AVAudioSessionRecordPermissionDenied) {
        // Permission denied
        bCanRecord = NO;
    }
    
    return bCanRecord;
}

- (void)startRecord {
    [self registerRecordManager];
    [self registerSesstion];
    [self startTimer];
    [_recorder record];
}

- (void)finishRecord {
    double cTime = _recorder.currentTime;
    [_recorder stop];

    if (cTime > 1) {
        [_audioRecorderDelegate audioRecordFinished:_audioFileName filePath:_audioPath duration:cTime];
    } else {
        [_recorder deleteRecording];
        [self deleteFileAtPath:_audioPath];
    }
}

- (void)cancelRecord {
    if ([_recorder respondsToSelector:@selector(stop)]) {
        [_recorder stop];
    }
    if ([_recorder respondsToSelector:@selector(deleteRecording)]) {
        [_recorder deleteRecording];
    }
    
    [self deleteFileAtPath:_audioPath];
}

- (BOOL)isRecording {
    return [_recorder isRecording];
}

#pragma mark - Timer
- (void)startTimer {
    if (_timer){
        [_timer invalidate];_timer = nil;
    }
    _timer = [NSTimer scheduledTimerWithTimeInterval:0.1f target:self selector:@selector(updateMeters) userInfo:nil repeats:YES];
}

- (void)stopTimer {
    if (_timer && _timer.isValid){
        [_timer invalidate];_timer = nil;
    }
}

- (void)updateMeters {
    if (_recorder.isRecording){
        
        [_recorder updateMeters];
        _curCount += 0.1f;
        
        [_audioRecorderDelegate audioRecordUpdateMeter:pow(10, (0.05 * [_recorder peakPowerForChannel:0])) duration:_curCount];
        
        if (_curCount >= kMaxRecordTime){
            [self finishRecord];
        }
    }
}

- (void)resetRecorder {
    [self stopTimer];
    _curCount = 0;
}

- (NSInteger)audioRecordDuration {
    return _recorder.currentTime;
}

- (float)recordMeter {
    float meter = 0.0;
    if (_recorder.isRecording){
        
        [_recorder updateMeters];
        meter = pow(10, (0.05 * [_recorder peakPowerForChannel:0]));
        NSLog(@"recordMeter:%f",meter);
    }
    return meter;
}

#pragma mark - AVAudioRecorderDelegate
/* audioRecorderDidFinishRecording:successfully: is called when a recording has been finished or stopped. This method is NOT called if the recorder is stopped due to an interruption. */
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag {
    [self resetRecorder];
}

/* if an error occurs while encoding it will be reported to the delegate. */
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)recorder error:(NSError * __nullable)error {
    [self resetRecorder];
}

#pragma mark - Receiver / speaker switching
- (void)handleNotification:(BOOL)state {
    [[UIDevice currentDevice] setProximityMonitoringEnabled:state]; // Recommended: set to YES before playback and back to NO when playback ends; this enables the proximity (infrared) sensor
    if (state) { // Add observer
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(proximitySensorChange:) name:UIDeviceProximityStateDidChangeNotification
                                                   object:nil];
    } else { // Remove observer
        [[NSNotificationCenter defaultCenter] removeObserver:self name:UIDeviceProximityStateDidChangeNotification object:nil];
    }
}

- (void)proximitySensorChange:(NSNotification *)notification {
    if ([[UIDevice currentDevice] proximityState] == YES) {
        // Near the ear: PlayAndRecord routes output to the receiver
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    } else {
        // Away from the ear: Playback routes output to the speaker
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
    }
}


#pragma mark - Path Utils
- (NSString*)getCurrentTimeString {
    NSDateFormatter * dateformat = [[NSDateFormatter  alloc]init];
    [dateformat setDateFormat:@"yyyyMMddHHmmss"];
    return [dateformat stringFromDate:[NSDate date]];
}

- (NSString *)getWavPathByFileName:(NSString *)fileName {
    return [[[self creatFolder:@"WAV" at:[self getDirLibPath]] stringByAppendingPathComponent:fileName] stringByAppendingPathExtension:@"wav"];
}

- (NSString *)getAmrPathByFileName:(NSString *)fileName {
    return [[[self creatFolder:@"AMR" at:[self getDirLibPath]] stringByAppendingPathComponent:fileName] stringByAppendingPathExtension:@"amr"];
}

// Get the Library directory in the app sandbox
- (NSString *)getDirLibPath {
    //[NSHomeDirectory() stringByAppendingPathComponent:@"Library"];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSLibraryDirectory, NSUserDomainMask, YES);
    return [paths objectAtIndex:0];
}
    
- (NSString *)creatFolder:(NSString *)folderName at:(NSString *)dirPath {
    NSString *temDirectory = [dirPath stringByAppendingPathComponent:folderName];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    // Create the directory if it does not exist yet
    [fileManager createDirectoryAtPath:temDirectory withIntermediateDirectories:YES attributes:nil error:nil];
    return temDirectory;
}

- (BOOL)deleteFileAtPath:(NSString*)path {
    return [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
}

@end
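
One note on checkCanRecord above: when the permission is still undetermined it returns YES, and the system microphone prompt only appears once recording actually starts. To ask up front instead, AVAudioSession offers requestRecordPermission: (a minimal sketch; where to call it is up to the app). On iOS 10 and later the app also needs an NSMicrophoneUsageDescription entry in Info.plist, otherwise it will crash the first time the microphone is accessed.

// Ask for microphone permission explicitly before the first recording
[[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (!granted) {
            NSLog(@"Microphone permission denied");
        }
    });
}];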

3. Playing audio with AVAudioPlayer

AVAudioPlayer is part of the AVFoundation framework and is used to play audio.
It can be initialized with initWithData: to play from NSData, or with initWithContentsOfURL: to play from a file URL.
Of the AVAudioPlayer delegate methods, the following two are used here:

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    // Finished playing
}

/* if an error occurs while decoding it will be reported to the delegate. */
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error {
    // An error occurred while decoding
}
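
For reference, the bare minimum to play back a local recording looks roughly like this (a sketch; 'filePath' stands for the path returned by the recorder, and note that a plain file path should be wrapped with fileURLWithPath: rather than URLWithString:):

NSURL *fileURL = [NSURL fileURLWithPath:filePath];
NSError *playError = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:&playError];
player.delegate = self;   // receive the delegate callbacks shown above
[player prepareToPlay];   // preload the audio buffers
[player play];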

The complete playback code:

#import "SDAudioPlayerManager.h"
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

static SDAudioPlayerManager *manager = nil;

@interface SDAudioPlayerManager ()<AVAudioPlayerDelegate>

@property (nonatomic, strong) AVAudioPlayer *player;

@property (nonatomic, copy) SDAudioPlayerCompletion compltionBlock;

@end

@implementation SDAudioPlayerManager

+ (instancetype)shareInstance {
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        manager = [[SDAudioPlayerManager alloc] init];
        [manager setAudioSession];
    });
    return manager;
}

/**
 Configure the audio session
 */
- (void)setAudioSession {
    NSError *error = nil;
    // Playback category, mixed with audio from other apps
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                     withOptions:AVAudioSessionCategoryOptionMixWithOthers
                                           error:&error];
    if (nil != error) {
        NSLog(@"set Category error %@", error.localizedDescription);
    }
}

/**
 Play audio from NSData
 
 @param data the audio data
 */
- (void)startPlayerData:(NSData *)data compltion:(SDAudioPlayerCompletion)compltion{
    if (_player) {
        [_player stop];
        _player = nil;
    }
    
    _player = [[AVAudioPlayer alloc] initWithData:data error:nil];
    _player.meteringEnabled = YES;
    _player.delegate = self;
    [_player play];
    
    self.compltionBlock = compltion;
}

/**
 Play audio from a local file path
 
 @param filePath the file path
 */
- (void)startPlayerFilePath:(NSString *)filePath compltion:(SDAudioPlayerCompletion)compltion{
    if (_player) {
        [_player stop];
        _player = nil;
    }
    
    _player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:filePath] error:nil];
    _player.meteringEnabled = YES;
    _player.delegate = self;
    [_player play];
    
    self.compltionBlock = compltion;
}

/**
 Stop playback
 */
- (void)stopPlayer {
    if (_player) {
        [_player stop];
    }
}

#pragma mark  AVAudioPlayerDelegate
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [self stopPlayer];
    if (self.compltionBlock) {
        self.compltionBlock(YES);
    }
}

/* if an error occurs while decoding it will be reported to the delegate. */
- (void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError * __nullable)error {
    [self stopPlayer];
    if (self.compltionBlock) {
        self.compltionBlock(NO); // report failure rather than success on a decode error
    }
}

@end

4. Building the UI

4.1 Water ripple effect

While implementing recording, the UI needs an animation; here it is a water ripple effect.
The ripple is simply a CAAnimationGroup that combines a scale animation and an opacity animation.
The code is as follows:

- (void)waveFadeDuration:(CGFloat)aDuration
                   delay:(NSTimeInterval)delay
             repeatCount:(NSInteger)repeatCount
            animationKey:(NSString *)animationKey
                    view:(UIView *)view {
    
    CABasicAnimation *opacityAnima = [CABasicAnimation animationWithKeyPath:@"opacity"];
    opacityAnima.fromValue = @(1.0);
    opacityAnima.toValue = @(0.0);
    
    CABasicAnimation *scaleAnima = [CABasicAnimation animationWithKeyPath:@"transform"];
    scaleAnima.fromValue = [NSValue valueWithCATransform3D:CATransform3DScale(CATransform3DIdentity, 0.0, 0.0, 0.0)];
    scaleAnima.toValue = [NSValue valueWithCATransform3D:CATransform3DScale(CATransform3DIdentity, 1.0, 1.0, 0.0)];
    
    CAAnimationGroup *groupAnimation = [CAAnimationGroup animation];
    groupAnimation.animations = @[opacityAnima,scaleAnima];
    groupAnimation.duration = aDuration;
    groupAnimation.beginTime = delay +CACurrentMediaTime();
    groupAnimation.autoreverses = NO;
    groupAnimation.removedOnCompletion = NO;
    groupAnimation.repeatCount = MAXFLOAT;
    [view.layer addAnimation:groupAnimation forKey:animationKey];
}

Here each circle is drawn with Core Graphics in drawRect:; the container view stacks three of these wave views.

- (void)drawRect:(CGRect)rect {
    // Drawing code
    
    UIColor *color = self.waveColor;
    if (self.waveColor == nil) {
        color = [UIColor colorWithRed:138.0/255.0 green:204.0/255.0 blue:255.0/255.0 alpha:1.0];
    }
    CGContextRef context = UIGraphicsGetCurrentContext();
    
    // Fill color
    CGContextSetFillColorWithColor(context, color.CGColor);
    
    // Stroke color
    CGContextSetStrokeColorWithColor(context, color.CGColor);
    
    CGContextSetLineWidth(context, 0.0); // Line width
    CGContextAddEllipseInRect(context, CGRectMake(0.0, 0.0, CGRectGetWidth(rect), CGRectGetHeight(rect))); // Circle inscribed in the view bounds
    CGContextDrawPath(context, kCGPathFillStroke);
}

The complete water ripple code:

#import "SDRecorderWaterView.h"


@implementation SDWaterWaveView

- (void)setWaveColor:(UIColor *)waveColor {
    _waveColor = waveColor;
    [self setNeedsDisplay];
}

// Only override drawRect: if you perform custom drawing.
// An empty implementation adversely affects performance during animation.
- (void)drawRect:(CGRect)rect {
    // Drawing code
    
    UIColor *color = self.waveColor;
    if (self.waveColor == nil) {
        color = [UIColor colorWithRed:138.0/255.0 green:204.0/255.0 blue:255.0/255.0 alpha:1.0];
    }
    CGContextRef context = UIGraphicsGetCurrentContext();
    
    // Fill color
    CGContextSetFillColorWithColor(context, color.CGColor);
    
    // Stroke color
    CGContextSetStrokeColorWithColor(context, color.CGColor);
    
    CGContextSetLineWidth(context, 0.0); // Line width
    CGContextAddEllipseInRect(context, CGRectMake(0.0, 0.0, CGRectGetWidth(rect), CGRectGetHeight(rect))); // Circle inscribed in the view bounds
    CGContextDrawPath(context, kCGPathFillStroke);
}

@end

@interface SDRecorderWaterView ()

@property (nonatomic, strong) NSMutableArray *waveItems;

@end

@implementation SDRecorderWaterView

- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        self.waveItems = [NSMutableArray arrayWithCapacity:0];
        self.waveCount = 2; // Default is 2
        self.waveColor = [UIColor colorWithRed:220.0/255.0 green:89.0/255.0 blue:89.0/255.0 alpha:1.0];
        self.isAnimating = NO;
        self.waveSize = CGSizeMake(50.0, 50.0);
    }
    return self;
}

- (void)layoutSubviews {
    [super layoutSubviews];
    
    CGFloat wX = (CGRectGetWidth(self.bounds) - self.waveSize.width)/2.0;
    CGFloat wY = (CGRectGetHeight(self.bounds) - self.waveSize.height)/2.0;
    for (UIView *subView in self.subviews) {
        if ([subView isKindOfClass:[SDWaterWaveView class]]) {
            SDWaterWaveView *waveView = (SDWaterWaveView *)subView;
            waveView.frame = CGRectMake(wX, wY, self.waveSize.width, self.waveSize.height);
            [waveView setNeedsDisplay];
        }
    }
}

- (void)setWaveCount:(NSInteger)waveCount {
    _waveCount = waveCount;
    
    [self.waveItems removeAllObjects];
    
    for (UIView *subView in self.subviews) {
        if ([subView isKindOfClass:[SDWaterWaveView class]]) {
            [subView removeFromSuperview];
        }
    }
    
    for (NSInteger index = 0; index < waveCount; index++) {
        SDWaterWaveView *waveView = [[SDWaterWaveView alloc] initWithFrame:CGRectZero];
        waveView.backgroundColor = [UIColor clearColor];
        waveView.layer.opacity = 0.0;
        waveView.tagIndex = index;

        [self addSubview:waveView];
        [self.waveItems addObject:waveView];
    }
    [self setNeedsLayout];
}

- (void)setWaveSize:(CGSize)waveSize {
    _waveSize = waveSize;
    [self setNeedsLayout];
}

- (void)setWaveColor:(UIColor *)waveColor {
    _waveColor = waveColor;
    for (UIView *subView in self.subviews) {
        if ([subView isKindOfClass:[SDWaterWaveView class]]) {
            SDWaterWaveView *waveView = (SDWaterWaveView *)subView;
            waveView.waveColor = waveColor;
        }
    }
}

/// Start the animation
- (void)startAnimation {
    self.isAnimating = YES;
    for (UIView *subView in self.subviews) {
        if ([subView isKindOfClass:[SDWaterWaveView class]]) {
            SDWaterWaveView *waveView = (SDWaterWaveView *)subView;
            NSInteger tagIndex = waveView.tagIndex;
            [self waveFadeDuration:(self.waveCount*0.5) delay:(0.5*tagIndex) repeatCount:self.waveCount animationKey:@"waveViewAnimation" view:waveView];
        }
    }
}

/// Stop the animation
- (void)stopAnimation {
    for (UIView *subView in self.subviews) {
        if ([subView isKindOfClass:[SDWaterWaveView class]]) {
            SDWaterWaveView *waveView = (SDWaterWaveView *)subView;
            [waveView.layer removeAnimationForKey:@"waveViewAnimation"];
            waveView.layer.opacity = 0.0;
        }
    }
    self.isAnimating = NO;
}

- (void)waveFadeDuration:(CGFloat)aDuration
                   delay:(NSTimeInterval)delay
             repeatCount:(NSInteger)repeatCount
            animationKey:(NSString *)animationKey
                    view:(UIView *)view {
    
    CABasicAnimation *opacityAnima = [CABasicAnimation animationWithKeyPath:@"opacity"];
    opacityAnima.fromValue = @(1.0);
    opacityAnima.toValue = @(0.0);
    
    CABasicAnimation *scaleAnima = [CABasicAnimation animationWithKeyPath:@"transform"];
    scaleAnima.fromValue = [NSValue valueWithCATransform3D:CATransform3DScale(CATransform3DIdentity, 0.0, 0.0, 0.0)];
    scaleAnima.toValue = [NSValue valueWithCATransform3D:CATransform3DScale(CATransform3DIdentity, 1.0, 1.0, 0.0)];
    
    CAAnimationGroup *groupAnimation = [CAAnimationGroup animation];
    groupAnimation.animations = @[opacityAnima,scaleAnima];
    groupAnimation.duration = aDuration;
    groupAnimation.beginTime = delay +CACurrentMediaTime();
    groupAnimation.autoreverses = NO;
    groupAnimation.removedOnCompletion = NO;
    groupAnimation.repeatCount = MAXFLOAT;
    [view.layer addAnimation:groupAnimation forKey:animationKey];
}

@end

4.2 Spectrum effect during recording

The spectrum effect shown while recording is built from CAShapeLayer combined with UIBezierPath.
CAShapeLayer is a subclass of CALayer that can draw arbitrary shapes and is more flexible than a plain CALayer.
Its main properties are path, fillColor, fillRule, strokeColor, strokeStart, strokeEnd, lineWidth (measured in points), miterLimit, lineCap (the style of the line ends), lineJoin (the style of the joints between segments), lineDashPhase, and lineDashPattern.
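
As a standalone illustration (a minimal sketch independent of the spectrum view below), a single rounded vertical bar can be produced by stroking a straight UIBezierPath with a CAShapeLayer:

// One rounded bar: a straight path stroked with lineWidth and a round lineCap
CAShapeLayer *bar = [CAShapeLayer layer];
bar.strokeColor = [UIColor redColor].CGColor;
bar.fillColor   = [UIColor clearColor].CGColor;
bar.lineWidth   = 10.0;
bar.lineCap     = kCALineCapRound;

UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(20.0, 10.0)];
[path addLineToPoint:CGPointMake(20.0, 60.0)];
bar.path = path.CGPath;

[self.view.layer addSublayer:bar]; // assuming this runs inside a view controller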

The bars are driven by the metered level of the recording channel. While recording, call updateMeters on the recorder and then read peakPowerForChannel:; the value is in decibels (roughly -160 dB for silence up to 0 dB for full scale), and the code below maps it to a 0~1 value with pow(10, 0.05 * dB).

- (float)recordMeter {
    float meter = 0.0;
    if (_recorder.isRecording){
        
        [_recorder updateMeters];
        meter = pow(10, (0.05 * [_recorder peakPowerForChannel:0]));
        NSLog(@"recordMeter:%f",meter);
    }
    return meter;
}

The complete spectrum view code:

#import "SDRecorderSpectrumView.h"

@interface SDRecorderSpectrumView ()

@property (nonatomic, strong) NSMutableArray *levels;
@property (nonatomic, strong) NSMutableArray *itemLineLayers;

@property (nonatomic, strong) CADisplayLink *displayLink;

@end

@implementation SDRecorderSpectrumView

- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        self.numberOfItems = 12;
        self.itemColor = [UIColor colorWithRed:241/255.f green:60/255.f blue:57/255.f alpha:1.0];
        self.itemWidth = 10.0;
        self.itemHeight = 20.0;
    }
    return self;
}

- (void)setNumberOfItems:(NSUInteger)numberOfItems {
    if (_numberOfItems == numberOfItems) {
        return;
    }
    _numberOfItems = numberOfItems;

    self.levels = [[NSMutableArray alloc]init];
    for(int i = 0 ; i < self.numberOfItems; i++){
        [self.levels addObject:@(0)];
    }


    for (CAShapeLayer *itemLine in self.itemLineLayers) {
        [itemLine removeFromSuperlayer];
    }
    self.itemLineLayers = [NSMutableArray array];
    for(int i=0; i < numberOfItems; i++) {
        CAShapeLayer *itemLine = [CAShapeLayer layer];
        itemLine.lineCap       = kCALineCapRound;
        itemLine.lineJoin      = kCALineJoinRound;
        itemLine.strokeColor   = [[UIColor clearColor] CGColor];
        itemLine.fillColor     = [[UIColor clearColor] CGColor];
        itemLine.strokeColor   = [self.itemColor CGColor];
        itemLine.lineWidth     = self.itemWidth;

        [self.layer addSublayer:itemLine];
        [self.itemLineLayers addObject:itemLine];
    }
}

- (void)setItemWidth:(CGFloat)itemWidth {
    _itemWidth = itemWidth;
    for (CAShapeLayer *itemLine in self.itemLineLayers) {
        itemLine.lineWidth = self.itemWidth;
    }
}

- (void)setItemHeight:(CGFloat)itemHeight {
    _itemHeight = itemHeight;
}

- (void)setItemColor:(UIColor *)itemColor {
    _itemColor = itemColor;
    for (CAShapeLayer *itemLine in self.itemLineLayers) {
        itemLine.strokeColor = [itemColor CGColor];
    }
}

- (void)setItemBlock:(SDRecorderSpectrumItemBlock)itemBlock {
    _itemBlock = itemBlock;
    
    [self startAnimation];
}

- (void)setLevel:(CGFloat)level {
    if( level < 0 ) level = 0;
    _level = level;

    // Drop the oldest level (the last element)
    [self.levels removeObjectAtIndex:(self.numberOfItems-1)];
    
    // Insert the newest level at the front
    [self.levels insertObject:@(level) atIndex:0];
    NSLog(@"self.levels:%@",self.levels);
    
    [self updateItems];
}

/// Start the animation
- (void)startAnimation {
    // When the target of a CADisplayLink (or NSTimer) is a block, selector:@selector(invoke) makes the timer call the block
    if (self.displayLink == nil) {
        self.displayLink = [CADisplayLink displayLinkWithTarget:self.itemBlock selector:@selector(invoke)];
        self.displayLink.frameInterval = 3.f;
        [self.displayLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];
    }
}

/// Stop the animation
- (void)stopAnimation {
    [self.displayLink invalidate];
    self.displayLink = nil;
}

#pragma mark - update
- (void)updateItems {
    UIGraphicsBeginImageContext(self.frame.size);
    
    CGFloat height = CGRectGetHeight(self.bounds);
    CGFloat width = CGRectGetWidth(self.bounds);

    CGFloat lineOffset = 0.0;
    if (self.numberOfItems > 1) {
        lineOffset = (width - self.itemWidth*self.numberOfItems)/(self.numberOfItems - 1);
    }
    
    CGFloat from = 0.0;
    
    // Vertical headroom between the minimum bar height and the view height
    CGFloat distance =  (height - self.itemHeight);
    for(int i = 0; i < self.numberOfItems; i++) {
        CGFloat lineHeight = self.itemHeight + [self.levels[i] floatValue] * distance;
        //([[self.levels objectAtIndex:i]intValue]+1)*self.lineWidth/2.f;
        CGFloat lineTop = (height - lineHeight) / 2.f;
        CGFloat lineBottom = lineTop + lineHeight;

        if (i == 0) {
            from += lineOffset;
        } else {
            from += (lineOffset + self.itemWidth);
        }

        UIBezierPath *linePathRight = [UIBezierPath bezierPath];
        [linePathRight moveToPoint:CGPointMake(from, lineTop)];
        [linePathRight addLineToPoint:CGPointMake(from, lineBottom)];
        CAShapeLayer *itemLine = [self.itemLineLayers objectAtIndex:i];
        itemLine.path = [linePathRight CGPath];
    }
    
    UIGraphicsEndImageContext();
}

@end

4.3 The recording view controller

The view controller builds the UI and drives the recording calls, playback, and animation effects.
The complete code is as follows:

#import "SDAudioRecorderExampleViewController.h"
#import "SDAudioRecordManager.h"
#import "SDAudioPlayerManager.h"
#import "SDRecorderSpectrumView.h"
#import "SDRecorderWaterView.h"
#import "SDRecorderButton.h"

@interface SDAudioRecorderExampleViewController ()<SDAudioRecorderDelegate, SDRecorderButtonDelegate>

@property (strong, nonatomic) SDRecorderButton *recordButton;
@property (strong, nonatomic) UILabel *tipLabel;
@property (strong, nonatomic) UILabel *timeLabel;
@property (strong, nonatomic) SDRecorderSpectrumView *spectrumView;
@property (strong, nonatomic) SDRecorderWaterView *recorderWaterView;

@end

@implementation SDAudioRecorderExampleViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.view.backgroundColor = [UIColor whiteColor];
    
    [SDAudioRecordManager shareInstance].audioRecorderDelegate = self;
    
    [self.view addSubview:self.recorderWaterView];
    [self.view addSubview:self.recordButton];
    [self.view addSubview:self.tipLabel];
    [self.view addSubview:self.timeLabel];
    
    [self.view addSubview:self.spectrumView];
    
    __weak typeof(self) weakSelf = self;
    self.spectrumView.itemBlock = ^{
        // Update the level in real time
        float recordMeter = [[SDAudioRecordManager shareInstance] recordMeter];
        weakSelf.spectrumView.level = recordMeter;
    };
}


#pragma mark - ControlEvents
- (void)recordStart:(UIButton *)button {
    if (![[SDAudioRecordManager shareInstance] isRecording]) {
        self.tipLabel.text = @"Recording...";
        [[SDAudioRecordManager shareInstance] startRecord];
        NSLog(@"Start animation");
        [self.recorderWaterView startAnimation];
        
        [self.spectrumView startAnimation];
    }
}


- (void)recordCancel:(UIButton *)button {
    if ([[SDAudioRecordManager shareInstance] isRecording]) {
        self.tipLabel.text = @"";
        self.timeLabel.text = @"";
        NSLog(@"取消");
        [[SDAudioRecordManager shareInstance] cancelRecord];
        
        // 结束动画
        NSLog(@"结束动画");
        [self.recorderWaterView stopAnimation];
        
        [self.spectrumView stopAnimation];
    }
}

- (void)recordFinish:(UIButton *)button {
    if ([[SDAudioRecordManager shareInstance] isRecording]) {
        self.tipLabel.text = @"";
        self.timeLabel.text = @"";

        NSLog(@"完成");
        [[SDAudioRecordManager shareInstance] finishRecord];
        
        // 结束动画
        NSLog(@"结束动画");
        [self.recorderWaterView stopAnimation];
        
        [self.spectrumView stopAnimation];
    }
}

- (void)recordTouchDragExit:(UIButton *)button {
    if ([[SDAudioRecordManager shareInstance] isRecording]) {
        self.tipLabel.text = @"Release to cancel";
        // [self stopAnimate];
        // Stop the animations
        NSLog(@"Stop animation");
        [self.recorderWaterView stopAnimation];
        
        [self.spectrumView stopAnimation];
    }
}

- (void)recordTouchDragEnter:(UIButton *)button {
    if ([[SDAudioRecordManager shareInstance] isRecording]) {
        self.tipLabel.text = @"Recording...";
        // Restart the animations
        NSLog(@"Start animation");
        [self.recorderWaterView startAnimation];
        
        [self.spectrumView startAnimation];
    }
}

#pragma mark - SDRecorderButtonDelegate
- (void)endTracking {
    if ([[SDAudioRecordManager shareInstance] isRecording]) {
        NSLog(@"结束动画");
        [self.recorderWaterView stopAnimation];
        
        [self.spectrumView stopAnimation];
    }
}

- (void)cancelTracking {
    if ([[SDAudioRecordManager shareInstance] isRecording]) {
        NSLog(@"结束动画");
        [self.recorderWaterView stopAnimation];
        
        [self.spectrumView stopAnimation];
    }
}

#pragma mark - SDAudioRecorderDelegate
- (void)audioRecordUpdateMeter:(float)meter duration:(NSInteger)duration {
    // Refresh the on-screen meter/duration readout
    self.timeLabel.text = [NSString stringWithFormat:@"meter:%f,duration:%ld",meter,(long)duration];
    
    // meter is in the 0~1 range
    // self.spectrumView.level = meter;
}

- (void)audioRecordFinished:(NSString *)aSoundName filePath:(NSString *)filePath duration:(NSInteger)duration {
    NSLog(@"filePath:%@", filePath);
    [[SDAudioPlayerManager shareInstance] startPlayerFilePath:filePath compltion:^(BOOL finished) {
        NSLog(@"播放完成:%@", (finished?@"YES":@"NO"));
    }];
}


#pragma mark - layout
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];

    CGFloat width = self.view.bounds.size.width;
    CGFloat height = self.view.bounds.size.height;

    self.recordButton.frame = CGRectMake(width / 2.f - 50.f, height - 180.f, 100.f, 100.f);
    
    self.recorderWaterView.frame = CGRectMake(width / 2.f - 100.f, CGRectGetMinY(self.recordButton.frame) + (100.f - 200.f)/2.0, 200.f, 200.f);
    self.recorderWaterView.waveSize = CGSizeMake(200.f, 200.f);

    self.tipLabel.frame = CGRectMake(0, height - 240, width, 30);
    self.timeLabel.frame = CGRectMake(0, height - 340, width, 30);
    
    self.spectrumView.frame = CGRectMake(width / 2.f - 100.f, height - 440, 200, 50);
}

#pragma mark - Lazy getters
- (SDRecorderButton *)recordButton {
    if (!_recordButton) {
        _recordButton = [[SDRecorderButton alloc] init];

        [_recordButton setBackgroundImage:[UIImage imageNamed:@"Recording-default"] forState:UIControlStateNormal];
        [_recordButton setBackgroundImage:[UIImage imageNamed:@"Recording"] forState:UIControlStateFocused];

        // Touch down: start recording
        [_recordButton addTarget:self action:@selector(recordStart:) forControlEvents:UIControlEventTouchDown];
        // Touch up outside: cancel
        [_recordButton addTarget:self action:@selector(recordCancel:) forControlEvents: UIControlEventTouchUpOutside];
        // Touch up inside: finish
        [_recordButton addTarget:self action:@selector(recordFinish:) forControlEvents:UIControlEventTouchUpInside];
        // Finger dragged out of the button
        [_recordButton addTarget:self action:@selector(recordTouchDragExit:) forControlEvents:UIControlEventTouchDragExit];
        // Finger dragged back into the button
        [_recordButton addTarget:self action:@selector(recordTouchDragEnter:) forControlEvents:UIControlEventTouchDragEnter];
        
        _recordButton.tDelegate = self;
    }
    return _recordButton;
}

- (SDRecorderWaterView *)recorderWaterView {
    if (!_recorderWaterView) {
        _recorderWaterView = [[SDRecorderWaterView alloc] initWithFrame:CGRectZero];
        _recorderWaterView.backgroundColor = [UIColor clearColor];
        _recorderWaterView.waveCount = 3;
        _recorderWaterView.waveSize = CGSizeMake(200.0, 200.0);
        _recorderWaterView.waveColor = [UIColor colorWithRed:212.0/255.0 green:112.0/255.0 blue:112.0/255.0 alpha:1.0];
    }
    return _recorderWaterView;
}

- (SDRecorderSpectrumView *)spectrumView {
    if (!_spectrumView) {
        _spectrumView = [[SDRecorderSpectrumView alloc] initWithFrame:CGRectZero];
        _spectrumView.itemWidth = 10;
        _spectrumView.itemHeight = 10;
        _spectrumView.itemColor = [UIColor colorWithRed:212.0/255.0 green:112.0/255.0 blue:112.0/255.0 alpha:1.0];
    }
    
    return _spectrumView;
}

- (UILabel *)tipLabel {
    if (!_tipLabel) {
        _tipLabel = [[UILabel alloc]init];
        _tipLabel.textColor = [UIColor lightGrayColor];
        _tipLabel.textAlignment = NSTextAlignmentCenter;
    }
    return _tipLabel;
}

- (UILabel *)timeLabel {
    if (!_timeLabel) {
        _timeLabel = [[UILabel alloc]init];
        _timeLabel.textColor = [UIColor lightGrayColor];
        _timeLabel.textAlignment = NSTextAlignmentCenter;
    }
    return _timeLabel;
}

@end

5. Summary

This post walked through recording audio with AVAudioRecorder and playing it back with AVAudioPlayer: configuring the AVAudioSession category, recording with AVAudioRecorder, and playing with AVAudioPlayer, plus a water ripple animation and an audio spectrum animation for the recording UI.

These are my learning notes; a little progress every day.
