Like YUV data, PCM data is a raw stream: it carries no information about its sample format, channel count, or sample rate, and we can only play it correctly once we know those parameters. In AudioToolbox these parameters are described by the AudioStreamBasicDescription struct, which is defined as follows:
struct AudioStreamBasicDescription
{
    Float64             mSampleRate;        // sample rate
    AudioFormatID       mFormatID;          // audio data format
    AudioFormatFlags    mFormatFlags;       // flags describing how the data is laid out
    UInt32              mBytesPerPacket;    // bytes per packet
    UInt32              mFramesPerPacket;   // frames per packet
    UInt32              mBytesPerFrame;     // bytes per frame
    UInt32              mChannelsPerFrame;  // channels per frame
    UInt32              mBitsPerChannel;    // bits per channel
    UInt32              mReserved;
};
The PCM data I am using is in s16le format (signed 16-bit little-endian), with 2 channels and a 44.1 kHz sample rate, so I fill in the AudioStreamBasicDescription like this:
int channels = 2;
AudioStreamBasicDescription audioStreamBasicDescription = {0}; // zero-initialize so mReserved and any unset fields are 0
audioStreamBasicDescription.mSampleRate = 44100;
audioStreamBasicDescription.mFormatID = kAudioFormatLinearPCM;
audioStreamBasicDescription.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
audioStreamBasicDescription.mFramesPerPacket = 1;
audioStreamBasicDescription.mChannelsPerFrame = channels;
audioStreamBasicDescription.mBitsPerChannel = 16;
audioStreamBasicDescription.mBytesPerPacket = channels * 2; // 2 bytes per 16-bit sample, interleaved
audioStreamBasicDescription.mBytesPerFrame = channels * 2;
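For interleaved, packed PCM these byte counts follow directly from the bit depth and channel count. As a quick sanity check (my own sketch, not part of the original code), the arithmetic works out like this:

UInt32 bytesPerSample = audioStreamBasicDescription.mBitsPerChannel / 8;                // 16 bits -> 2 bytes
UInt32 bytesPerFrame  = bytesPerSample * audioStreamBasicDescription.mChannelsPerFrame; // 2 * 2   -> 4 bytes
UInt32 bytesPerPacket = bytesPerFrame * audioStreamBasicDescription.mFramesPerPacket;   // 4 * 1   -> 4 bytes
// These match the mBytesPerFrame / mBytesPerPacket values assigned above.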
With these audio parameters in place, let's use AudioUnit to play the PCM data, step by step.
1. Initialize the AudioUnit
- (void)setupAudioUnit {
    // Describe the RemoteIO output unit we want.
    AudioComponentDescription audioDesc;
    audioDesc.componentType = kAudioUnitType_Output;
    audioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioDesc.componentFlags = 0;
    audioDesc.componentFlagsMask = 0;
    // Find the matching component and create an instance of it.
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &audioDesc);
    AudioComponentInstanceNew(inputComponent, &_audioUnit);
}
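Both AudioComponentFindNext and AudioComponentInstanceNew can fail: the former returns NULL when no matching component exists, and the latter returns an OSStatus. The code above ignores this for brevity; a minimal sketch with checks (the log messages are my own) could look like this:

AudioComponent inputComponent = AudioComponentFindNext(NULL, &audioDesc);
if (inputComponent == NULL) {
    NSLog(@"No matching RemoteIO output component was found");
    return;
}
OSStatus status = AudioComponentInstanceNew(inputComponent, &_audioUnit);
if (status != noErr) {
    NSLog(@"AudioComponentInstanceNew failed with OSStatus %d", (int)status);
}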
2. Set the AudioUnit properties
- (void)setupAudioUnitProperty:(AudioStreamBasicDescription)audioStreamBasicDescription {
    // Describe the PCM format we will feed into the output unit (input scope of bus 0).
    AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &audioStreamBasicDescription, sizeof(audioStreamBasicDescription));
    // Install the render callback that the unit calls whenever it needs more PCM data.
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = (AURenderCallback)AudioUnitPlayer_AURenderCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
    AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, &callbackStruct, sizeof(callbackStruct));
    // Initialize the unit so it can be started later with AudioOutputUnitStart.
    AudioUnitInitialize(self.audioUnit);
}
In this step we hand the AudioStreamBasicDescription we built earlier to the AudioUnit, so it knows how to interpret and play the PCM data. We also register a callback, AudioUnitPlayer_AURenderCallback: while audio is playing, whenever the audio device needs more data it invokes this callback to tell us its PCM buffer is empty and to ask us to supply more. In other words, this is a pull model in which the device requests data from us, not a push model in which we feed data to the device on our own schedule.
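How much data each callback has to supply follows from the stream format: the device asks for inNumberFrames frames, and with the interleaved s16le stereo format above every frame is 4 bytes. A small sketch of that arithmetic (the 512-frame request size is only an illustrative assumption; the actual value depends on the hardware and audio session settings):

UInt32 bytesPerFrame = 2 /* channels */ * 2 /* bytes per 16-bit sample */;  // 4 bytes per frame
UInt32 inNumberFrames = 512;                                                // example request size only
UInt32 expectedByteSize = inNumberFrames * bytesPerFrame;                   // 2048 bytes
// ioData->mBuffers[0].mDataByteSize in the callback should equal this value for bus 0.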
3. Define the AudioUnitPlayer_AURenderCallback method
static OSStatus AudioUnitPlayer_AURenderCallback(void *inRefCon,
                                                 AudioUnitRenderActionFlags *ioActionFlags,
                                                 const AudioTimeStamp *inTimeStamp,
                                                 UInt32 inBusNumber,
                                                 UInt32 inNumberFrames,
                                                 AudioBufferList *ioData)
{
    // Zero the buffer first so any shortfall plays back as silence instead of garbage.
    memset(ioData->mBuffers[0].mData, 0, ioData->mBuffers[0].mDataByteSize);
    AudioUnitPlayer *player = (__bridge AudioUnitPlayer *)inRefCon;
    // Read exactly as many PCM bytes as the device is asking for.
    NSData *data = [player.fileHandle readDataOfLength:ioData->mBuffers[0].mDataByteSize];
    if (data.length < ioData->mBuffers[0].mDataByteSize) {
        // Reached the end of the file: rewind to the beginning and stop playback.
        if (@available(iOS 13.0, *)) {
            [player.fileHandle seekToOffset:0 error:nil];
        } else {
            [player.fileHandle seekToFileOffset:0];
        }
        [player stopPlayer];
        return noErr;
    }
    memcpy(ioData->mBuffers[0].mData, data.bytes, data.length);
    return noErr;
}
As you can see, this callback is where we hand the PCM data over to the audio device.
4. Start or stop playback
- (void)startPlayer {
    AudioOutputUnitStart(self.audioUnit);
}
- (void)stopPlayer {
    AudioOutputUnitStop(self.audioUnit);
}
The complete code is as follows:
.h file
#import <Foundation/Foundation.h>
#import <AudioToolbox/AudioToolbox.h>
NS_ASSUME_NONNULL_BEGIN
@interface AudioUnitPlayer : NSObject
+ (instancetype)player:(AudioStreamBasicDescription)audioDesc pcmPath:(NSString *)pcmPath;
- (void)startPlayer;
- (void)stopPlayer;
@end
NS_ASSUME_NONNULL_END
.m file
#import "AudioUnitPlayer.h"
@interface AudioUnitPlayer ()
@property (nonatomic, assign) AudioUnit audioUnit;
@property (nonatomic, strong) NSFileHandle *fileHandle;
@end
@implementation AudioUnitPlayer
+ (instancetype)player:(AudioStreamBasicDescription)audioStreamBasicDescription pcmPath:(NSString *)pcmPath
{
    AudioUnitPlayer *obj = [[AudioUnitPlayer alloc] init:audioStreamBasicDescription pcmPath:pcmPath];
    return obj;
}
- (instancetype)init:(AudioStreamBasicDescription)audioStreamBasicDescription pcmPath:(NSString *)pcmPath
{
    self = [super init];
    if (self) {
        [self setupAudioUnit];
        [self setupAudioUnitProperty:audioStreamBasicDescription];
        self.fileHandle = [NSFileHandle fileHandleForReadingAtPath:pcmPath];
    }
    return self;
}
- (void)setupAudioUnit {
    AudioComponentDescription audioDesc;
    audioDesc.componentType = kAudioUnitType_Output;
    audioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
    audioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
    audioDesc.componentFlags = 0;
    audioDesc.componentFlagsMask = 0;
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &audioDesc);
    AudioComponentInstanceNew(inputComponent, &_audioUnit);
}
- (void)setupAudioUnitProperty:(AudioStreamBasicDescription)audioStreamBasicDescription {
    AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &audioStreamBasicDescription, sizeof(audioStreamBasicDescription));
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = (AURenderCallback)AudioUnitPlayer_AURenderCallback;
    callbackStruct.inputProcRefCon = (__bridge void * _Nullable)(self);
    AudioUnitSetProperty(self.audioUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, 0, &callbackStruct, sizeof(callbackStruct));
    AudioUnitInitialize(self.audioUnit);
}
static OSStatus AudioUnitPlayer_AURenderCallback(void *inRefCon,
                                                 AudioUnitRenderActionFlags *ioActionFlags,
                                                 const AudioTimeStamp *inTimeStamp,
                                                 UInt32 inBusNumber,
                                                 UInt32 inNumberFrames,
                                                 AudioBufferList *ioData)
{
    memset(ioData->mBuffers[0].mData, 0, ioData->mBuffers[0].mDataByteSize);
    AudioUnitPlayer *player = (__bridge AudioUnitPlayer *)inRefCon;
    NSData *data = [player.fileHandle readDataOfLength:ioData->mBuffers[0].mDataByteSize];
    if (data.length < ioData->mBuffers[0].mDataByteSize) {
        if (@available(iOS 13.0, *)) {
            [player.fileHandle seekToOffset:0 error:nil];
        } else {
            [player.fileHandle seekToFileOffset:0];
        }
        [player stopPlayer];
        return noErr;
    }
    memcpy(ioData->mBuffers[0].mData, data.bytes, data.length);
    return noErr;
}
- (void)startPlayer {
    AudioOutputUnitStart(self.audioUnit);
}
- (void)stopPlayer {
    AudioOutputUnitStop(self.audioUnit);
}
@end
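Finally, a minimal usage sketch: it builds the same s16le, 2-channel, 44.1 kHz AudioStreamBasicDescription from earlier and plays a raw PCM file. The file name test.pcm is only a placeholder for illustration, and the caller must keep a strong reference to the player (for example in a property), otherwise ARC releases it and playback stops immediately.

int channels = 2;
AudioStreamBasicDescription asbd = {0};
asbd.mSampleRate = 44100;
asbd.mFormatID = kAudioFormatLinearPCM;
asbd.mFormatFlags = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
asbd.mFramesPerPacket = 1;
asbd.mChannelsPerFrame = channels;
asbd.mBitsPerChannel = 16;
asbd.mBytesPerPacket = channels * 2;
asbd.mBytesPerFrame = channels * 2;

// "test.pcm" is a placeholder; point this at your own raw PCM file.
NSString *pcmPath = [[NSBundle mainBundle] pathForResource:@"test" ofType:@"pcm"];
// self.player is assumed to be a strong property on the calling object.
self.player = [AudioUnitPlayer player:asbd pcmPath:pcmPath];
[self.player startPlayer];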