During development we ran into a few problems with recording and playback. The three troublesome ones were:
Detecting whether an audio input device is present
Selecting the output device when more than one is available
Detecting when headphones are plugged in or unplugged
For the first problem: devices such as the iPod touch and iPad have no built-in microphone, so we must check whether a headset with recording capability is plugged in; the iPhone already has a built-in microphone, so it is the easy case. For the second problem: when headphones (or another output device) are plugged into a device that has its own speaker, there are suddenly multiple output devices, and the app needs to decide where the sound should go. For the third problem: plugging in or unplugging headphones always changes the audio output device, and on an iPod touch or iPad, plugging in or unplugging a headset with a microphone also changes the audio input device.
1. Detecting the audio input device
- (BOOL)hasMicphone {
    return [[AVAudioSession sharedInstance] inputIsAvailable];
}
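As a usage sketch (the audioHelper variable, the alert text, and the startRecording method below are our own placeholders, not part of the original helper), a recording screen on an iPod touch or iPad might gate its record button on this check:

// Hypothetical call site (needs UIKit): refuse to start recording until a
// headset with a microphone is plugged in.
if (![audioHelper hasMicphone]) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"No microphone"
                                                    message:@"Please plug in a headset with a microphone to record."
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
    [alert release];        // the article's code base is non-ARC
    return;
}
[self startRecording];      // placeholder method that sets up the recorder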
2. Detecting audio output devices
For output devices we only consider two cases: the device's built-in speaker (iPod touch, iPad, and iPhone all have one), and whether headphones with their own output are currently plugged in. iOS already provides a way to query the current audio route, so we only need to check whether the devices we care about appear in it.
Getting the current audio route:
CFStringRef route;
UInt32 propertySize = sizeof(CFStringRef);
AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
The possible audio route values on iOS are:
/* Known values of route:
* "Headset"
* "Headphone"
* "Speaker"
* "SpeakerAndMicrophone"
* "HeadphonesAndMicrophone"
* "HeadsetInOut"
* "ReceiverAndMicrophone"
* "Lineout"
*/
See the iOS documentation for what each value means. Here we only care about whether headphones are present, so we just check whether the route contains "Headphone" or "Headset":
- (BOOL)hasHeadset {
#if TARGET_IPHONE_SIMULATOR
    #warning *** Simulator mode: audio session code works only on a device
    return NO;
#else
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    if ((route == NULL) || (CFStringGetLength(route) == 0)) {
        // Silent Mode
        NSLog(@"AudioRoute: SILENT, do nothing!");
    } else {
        NSString *routeStr = (NSString *)route;
        NSLog(@"AudioRoute: %@", routeStr);
        /* Known values of route:
         * "Headset"
         * "Headphone"
         * "Speaker"
         * "SpeakerAndMicrophone"
         * "HeadphonesAndMicrophone"
         * "HeadsetInOut"
         * "ReceiverAndMicrophone"
         * "Lineout"
         */
        NSRange headphoneRange = [routeStr rangeOfString:@"Headphone"];
        NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
        if (headphoneRange.location != NSNotFound) {
            return YES;
        } else if (headsetRange.location != NSNotFound) {
            return YES;
        }
    }
    return NO;
#endif
}
Note that the AudioRoute query does not work in the Simulator (it crashes outright), so the Simulator case has to be handled first, as above.
3. Setting the audio output device
In our project, users may plug in or unplug headphones while audio is playing. If headphones are plugged in during playback, Apple automatically routes the sound to the headphones and adjusts the volume to a suitable level; if the headphones are unplugged during playback, the sound automatically comes out of the device's own speaker, but the volume is not turned back up.
In our testing we found two problems when headphones are unplugged during playback (they may not be problems for you, but they affected our app):
Music playback stops automatically
The volume is not raised automatically; the system keeps playing through the speaker at the lower, headphone-appropriate level
The first problem boils down to detecting the unplug event; the second requires forcing the output device back to the built-in speaker when the headphones are removed.
Forcing the audio output device:
- (void)resetOutputTarget {
    BOOL hasHeadset = [self hasHeadset];
    NSLog(@"Will Set output target is_headset = %@ .", hasHeadset ? @"YES" : @"NO");
    UInt32 audioRouteOverride = hasHeadset ?
        kAudioSessionOverrideAudioRoute_None : kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
}
As you can see, we modify the AudioSession property kAudioSessionProperty_OverrideAudioRoute, which the iOS documentation describes as follows:
kAudioSessionProperty_OverrideAudioRoute
Specifies whether or not to override the audio session category's normal audio route. Can be set with one of two values: kAudioSessionOverrideAudioRoute_None, which specifies that you want to use the normal audio route; and kAudioSessionOverrideAudioRoute_Speaker, which sends output audio to the speaker. A write-only UInt32 value.
Upon an audio route change (such as by plugging in or unplugging a headset), or upon interruption, this property reverts to its default value. This property can be used only with the kAudioSessionCategory_PlayAndRecord (or the equivalent AVAudioSessionCategoryPlayAndRecord) category.
So this property can only be used when the category is kAudioSessionCategory_PlayAndRecord (AVAudioSessionCategoryPlayAndRecord on the AVAudioSession side), which means we also need a way to set the AudioSession category.
4. Setting the audio category (which I think of as the session's working mode)
Audio on iOS supports several categories (working modes); to use a given feature, the AudioSession must first be put into a category that supports it. The supported categories are:
Audio Session Categories
Category identifiers for audio sessions, used as values for the setCategory:error: method.
NSString *const AVAudioSessionCategoryAmbient;
NSString *const AVAudioSessionCategorySoloAmbient;
NSString *const AVAudioSessionCategoryPlayback;
NSString *const AVAudioSessionCategoryRecord;
NSString *const AVAudioSessionCategoryPlayAndRecord;
NSString *const AVAudioSessionCategoryAudioProcessing;
See the iOS documentation for what each category does. AVAudioSessionCategoryRecord is a record-only mode, AVAudioSessionCategoryPlayAndRecord supports both recording and playback, and AVAudioSessionCategoryPlayback is the ordinary playback mode.
Setting the category:
- (BOOL)checkAndPrepareCategoryForRecording {
    recording = YES;
    BOOL hasMicphone = [self hasMicphone];
    NSLog(@"Will Set category for recording! hasMicphone = %@", hasMicphone ? @"YES" : @"NO");
    if (hasMicphone) {
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                               error:nil];
    }
    [self resetOutputTarget];
    return hasMicphone;
}
- (void)resetCategory {
    if (!recording) {
        NSLog(@"Will Set category to static value = AVAudioSessionCategoryPlayback!");
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                               error:nil];
    }
}
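Putting the helper methods together, a recording flow might look roughly like this; the AVAudioRecorder settings, file path, and surrounding control flow are illustrative only, not part of the original helper. It assumes audioHelper has already had its session initialised (see initSession in the complete code at the end).

// Hypothetical recording flow built on the helper (non-ARC).
if ([audioHelper checkAndPrepareCategoryForRecording]) {
    NSURL *url = [NSURL fileURLWithPath:
                  [NSTemporaryDirectory() stringByAppendingPathComponent:@"memo.caf"]];
    NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                              [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
                              [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                              [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                              nil];
    AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:nil];
    [recorder record];
    // ... later, when the user stops recording:
    [recorder stop];
    [audioHelper cleanUpForEndRecording];   // restores the plain playback category
    [recorder release];
}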
5. Detecting headphone plug/unplug events
Plug/unplug detection is implemented by listening for the AudioSession RouteChange event and then checking the headset state. There are two steps: register a listener function, then determine the headset state inside that listener.
Registering the listener:
AudioSessionAddPropertyListener (kAudioSessionProperty_AudioRouteChange,
                                 audioRouteChangeListenerCallback,
                                 self);
We want to react when headphones are plugged in or unplugged, but a RouteChange event can be triggered for several reasons, so we need to handle each reason and combine it with the current headset state. According to the iOS documentation, the possible reasons are:
Audio Session Route Change Reasons
Identifiers for the various reasons that an audio route can change while your iOS application is running.
enum {
    kAudioSessionRouteChangeReason_Unknown = 0,
    kAudioSessionRouteChangeReason_NewDeviceAvailable = 1,
    kAudioSessionRouteChangeReason_OldDeviceUnavailable = 2,
    kAudioSessionRouteChangeReason_CategoryChange = 3,
    kAudioSessionRouteChangeReason_Override = 4,
    // this enum has no constant with a value of 5
    kAudioSessionRouteChangeReason_WakeFromSleep = 6,
    kAudioSessionRouteChangeReason_NoSuitableRouteForCategory = 7
};
See the iOS documentation for the details of each reason. The ones we care about are kAudioSessionRouteChangeReason_NewDeviceAvailable (a new device was plugged in), kAudioSessionRouteChangeReason_OldDeviceUnavailable (an existing device was removed), and kAudioSessionRouteChangeReason_NoSuitableRouteForCategory (the current category has no suitable device).
When a new device becomes available and a headset is detected, we treat it as a headset-plugged-in event; when an old device becomes unavailable and no headset is detected, we treat it as a headset-unplugged event; when there is no suitable route for the current category, we conclude that the microphone was unplugged while recording.
Obviously this logic is not strictly accurate: if a headset was already present and some other audio device is plugged in, or no headset was present and some other device is unplugged, the conclusion is wrong. But our project does not really care whether it was specifically a headset that changed; what matters is that whenever an audio device is plugged in or unplugged we re-apply our settings based on the current headset/microphone state, so for our purposes this approach is good enough.
The listener implementation:
void audioRouteChangeListenerCallback (
    void                    *inUserData,
    AudioSessionPropertyID  inPropertyID,
    UInt32                  inPropertyValueSize,
    const void              *inPropertyValue
) {
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;
    // Determine the reason for the route change, to ensure that it is not
    // because of a category change.
    CFDictionaryRef routeChangeDictionary = inPropertyValue;
    CFNumberRef routeChangeReasonRef =
        CFDictionaryGetValue (routeChangeDictionary,
                              CFSTR (kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue (routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);
    NSLog(@" ======================= RouteChangeReason : %d", routeChangeReason);
    AudioHelper *_self = (AudioHelper *) inUserData;
    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        [_self resetSettings];
        if (![_self hasHeadset]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"unpluggingHeadset"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        [_self resetSettings];
        if (![_self hasMicphone]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"pluggInMicrophone"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NoSuitableRouteForCategory) {
        [_self resetSettings];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"lostMicroPhone"
                                                            object:nil];
    }
    //else if (routeChangeReason == kAudioSessionRouteChangeReason_CategoryChange ) {
    //    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    //}
    [_self printCurrentCategory];
}
Once one of these events is detected, NSNotificationCenter is used to tell observers that a headset (with or without a microphone) was plugged in or unplugged, which in turn triggers the appropriate handling.
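For example, a controller interested in these events could subscribe to the notifications posted above (the selector names here are placeholders to be implemented by the observer):

// Hypothetical observer registration, matching the notification names above.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleHeadsetUnplugged:)
                                             name:@"unpluggingHeadset"
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleMicrophonePluggedIn:)
                                             name:@"pluggInMicrophone"
                                           object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(handleMicrophoneLost:)
                                             name:@"lostMicroPhone"
                                           object:nil];
// Remember to call removeObserver: in dealloc (this sample is non-ARC).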
6. Event handling
For a headset (with or without a microphone) plug/unplug event, the handlers generally need to do the following (a sketch of such handlers follows the list):
Force-reset the audio output device, so the system does not keep playing through the speaker at the low, headphone-appropriate volume
If playback was in progress before the unplug, resume the paused playback (the system pauses playback automatically when headphones are removed)
If recording was in progress before the unplug, check the microphone state and decide whether to stop recording (for the case where a headset with a microphone is pulled out of an iPod touch or iPad while recording)
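A minimal sketch of such handlers, assuming the observing controller keeps an AVAudioPlayer (player), an AVAudioRecorder (recorder), the AudioHelper instance (audioHelper), and a wasPlayingBeforeRouteChange flag as instance variables; all of these names are placeholders, not part of the original helper:

// Placeholder handlers for the notifications registered above.
- (void)handleHeadsetUnplugged:(NSNotification *)notification {
    [audioHelper resetOutputTarget];     // route audio back to the speaker at normal volume
    if (wasPlayingBeforeRouteChange) {
        [player play];                   // the system paused playback when the headset left
    }
}

- (void)handleMicrophoneLost:(NSNotification *)notification {
    if ([recorder isRecording] && ![audioHelper hasMicphone]) {
        [recorder stop];                 // no usable input device left, stop recording
        [audioHelper cleanUpForEndRecording];
    }
}

The resetOutputTarget call is what prevents the quiet-speaker problem described in section 3.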
Complete code
AudioHelper.h
#import <Foundation/Foundation.h>
@interface AudioHelper : NSObject {
    BOOL recording;
}
- (void)initSession;
- (BOOL)hasHeadset;
- (BOOL)hasMicphone;
- (void)cleanUpForEndRecording;
- (BOOL)checkAndPrepareCategoryForRecording;
@end
AudioHelper.m
#import "AudioHelper.h"
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
@implementation AudioHelper
- (BOOL)hasMicphone {
    return [[AVAudioSession sharedInstance] inputIsAvailable];
}
- (BOOL)hasHeadset {
#if TARGET_IPHONE_SIMULATOR
    #warning *** Simulator mode: audio session code works only on a device
    return NO;
#else
    CFStringRef route;
    UInt32 propertySize = sizeof(CFStringRef);
    AudioSessionGetProperty(kAudioSessionProperty_AudioRoute, &propertySize, &route);
    if ((route == NULL) || (CFStringGetLength(route) == 0)) {
        // Silent Mode
        NSLog(@"AudioRoute: SILENT, do nothing!");
    } else {
        NSString *routeStr = (NSString *)route;
        NSLog(@"AudioRoute: %@", routeStr);
        /* Known values of route:
         * "Headset"
         * "Headphone"
         * "Speaker"
         * "SpeakerAndMicrophone"
         * "HeadphonesAndMicrophone"
         * "HeadsetInOut"
         * "ReceiverAndMicrophone"
         * "Lineout"
         */
        NSRange headphoneRange = [routeStr rangeOfString:@"Headphone"];
        NSRange headsetRange = [routeStr rangeOfString:@"Headset"];
        if (headphoneRange.location != NSNotFound) {
            return YES;
        } else if (headsetRange.location != NSNotFound) {
            return YES;
        }
    }
    return NO;
#endif
}
- (void)resetOutputTarget {
    BOOL hasHeadset = [self hasHeadset];
    NSLog(@"Will Set output target is_headset = %@ .", hasHeadset ? @"YES" : @"NO");
    UInt32 audioRouteOverride = hasHeadset ?
        kAudioSessionOverrideAudioRoute_None : kAudioSessionOverrideAudioRoute_Speaker;
    AudioSessionSetProperty(kAudioSessionProperty_OverrideAudioRoute, sizeof(audioRouteOverride), &audioRouteOverride);
    [self hasHeadset];   // re-query the route so the new target shows up in the log
}
- (BOOL)checkAndPrepareCategoryForRecording {
    recording = YES;
    BOOL hasMicphone = [self hasMicphone];
    NSLog(@"Will Set category for recording! hasMicphone = %@", hasMicphone ? @"YES" : @"NO");
    if (hasMicphone) {
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord
                                               error:nil];
    }
    [self resetOutputTarget];
    return hasMicphone;
}
- (void)resetCategory {
    if (!recording) {
        NSLog(@"Will Set category to static value = AVAudioSessionCategoryPlayback!");
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
                                               error:nil];
    }
}
- (void)resetSettings {
    [self resetOutputTarget];
    [self resetCategory];
    BOOL isSucced = [[AVAudioSession sharedInstance] setActive:YES error:NULL];
    if (!isSucced) {
        NSLog(@"Reset audio session settings failed!");
    }
}
- (void)cleanUpForEndRecording {
    recording = NO;
    [self resetSettings];
}
- (void)printCurrentCategory {
    return;   // debug helper disabled; remove this line to log the current category
    UInt32 audioCategory;
    UInt32 size = sizeof(audioCategory);
    AudioSessionGetProperty(kAudioSessionProperty_AudioCategory, &size, &audioCategory);
    if (audioCategory == kAudioSessionCategory_UserInterfaceSoundEffects) {
        NSLog(@"current category is : kAudioSessionCategory_UserInterfaceSoundEffects");
    } else if (audioCategory == kAudioSessionCategory_AmbientSound) {
        NSLog(@"current category is : kAudioSessionCategory_AmbientSound");
    } else if (audioCategory == kAudioSessionCategory_SoloAmbientSound) {
        NSLog(@"current category is : kAudioSessionCategory_SoloAmbientSound");
    } else if (audioCategory == kAudioSessionCategory_MediaPlayback) {
        NSLog(@"current category is : kAudioSessionCategory_MediaPlayback");
    } else if (audioCategory == kAudioSessionCategory_LiveAudio) {
        NSLog(@"current category is : kAudioSessionCategory_LiveAudio");
    } else if (audioCategory == kAudioSessionCategory_RecordAudio) {
        NSLog(@"current category is : kAudioSessionCategory_RecordAudio");
    } else if (audioCategory == kAudioSessionCategory_PlayAndRecord) {
        NSLog(@"current category is : kAudioSessionCategory_PlayAndRecord");
    } else if (audioCategory == kAudioSessionCategory_AudioProcessing) {
        NSLog(@"current category is : kAudioSessionCategory_AudioProcessing");
    } else {
        NSLog(@"current category is : unknown");
    }
}
void audioRouteChangeListenerCallback (
    void                    *inUserData,
    AudioSessionPropertyID  inPropertyID,
    UInt32                  inPropertyValueSize,
    const void              *inPropertyValue
) {
    if (inPropertyID != kAudioSessionProperty_AudioRouteChange) return;
    // Determine the reason for the route change, to ensure that it is not
    // because of a category change.
    CFDictionaryRef routeChangeDictionary = inPropertyValue;
    CFNumberRef routeChangeReasonRef =
        CFDictionaryGetValue (routeChangeDictionary,
                              CFSTR (kAudioSession_AudioRouteChangeKey_Reason));
    SInt32 routeChangeReason;
    CFNumberGetValue (routeChangeReasonRef, kCFNumberSInt32Type, &routeChangeReason);
    NSLog(@" ===================================== RouteChangeReason : %d", routeChangeReason);
    AudioHelper *_self = (AudioHelper *) inUserData;
    if (routeChangeReason == kAudioSessionRouteChangeReason_OldDeviceUnavailable) {
        [_self resetSettings];
        if (![_self hasHeadset]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"unpluggingHeadset"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NewDeviceAvailable) {
        [_self resetSettings];
        if (![_self hasMicphone]) {
            [[NSNotificationCenter defaultCenter] postNotificationName:@"pluggInMicrophone"
                                                                object:nil];
        }
    } else if (routeChangeReason == kAudioSessionRouteChangeReason_NoSuitableRouteForCategory) {
        [_self resetSettings];
        [[NSNotificationCenter defaultCenter] postNotificationName:@"lostMicroPhone"
                                                            object:nil];
    }
    //else if (routeChangeReason == kAudioSessionRouteChangeReason_CategoryChange ) {
    //    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    //}
    [_self printCurrentCategory];
}
- (void)initSession {
    recording = NO;
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    [self resetSettings];
    AudioSessionAddPropertyListener (kAudioSessionProperty_AudioRouteChange,
                                     audioRouteChangeListenerCallback,
                                     self);
    [self printCurrentCategory];
    [[AVAudioSession sharedInstance] setActive:YES error:NULL];
}
- (void)dealloc {
[super dealloc];
}
@end