The overall layering of Android Audio:
Android app layer -> framework -> JNI -> library -> HAL -> Linux driver
<1> : First, where the Android Audio code lives in the source tree:
Application layer: ~/frameworks/base/media/java/android/media
Middle layers (framework and library code): ~/frameworks/av/media and ~/frameworks/av/services
HAL layer: ~/hardware/libhardware_legacy/audio and ~/hardware/libhardware; a customized platform may add its own, e.g. ~/hardware/imx/alsa
<2> : A quick walk through the directories above:
Application layer:
<1> : MediaPlayer and MediaRecorder are the two classes most heavily used by app developers with no special requirements; SoundPool playback belongs at this level too. These are the high-level classes for audio playback and recording.
<2> : AudioTrack and AudioRecord are lower-level than the Media* classes. Personally I think this is because their initialization and parameter configuration closely mirror the JNI layer, so they feel closer to the bottom of the stack. Note that although MediaPlayer is often said to ultimately play through an AudioTrack, that AudioTrack is the native one, not the same thing as the application-level AudioTrack class.
<3> : AudioService and AudioManager: anyone who has used them knows they are for settings and management, e.g. volume, mute, stream mode/type. Remember one word here: policy.
Middle layer, i.e. framework/JNI:
Most communication between the Android application layer and the framework layer goes through AIDL interfaces, which is why you see so many names starting with a capital I, such as IAudioService.
There is not much to say about these interfaces: if you know AIDL, just follow them through the code. Personally I don't think they are the focus here.
Two core modules deserve attention here: AudioFlinger and AudioPolicyService.
The usual summary: AudioPolicyService is the policy maker, AudioFlinger is the executor!!! Read that sentence three times. AudioPolicyService is the boss; AudioFlinger is the underling that does the actual hard work.
a> : AudioPolicyService: playback and recording both involve setting volume, mute, stream mode/type, buffer size, switching playback/recording devices, and so on. All of these are decided here; once decided, AudioPolicyService tells AudioFlinger to carry them out.
b> : AudioFlinger: the module that actually performs playback and recording. Its job in one sentence: follow AudioPolicyService's directions, sending audio data to the designated device for playback, or fetching audio data from the designated device when recording. So all the actual playback and recording work is done by AudioFlinger.
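This division of labor can be sketched in a few lines of C++. The class and enum names below are invented purely for illustration; the real AudioPolicyManager logic is far more involved, but the shape is the same: the policy decides where audio goes and moves no data, the executor obeys.

```cpp
#include <cassert>

// Hypothetical stream types and devices, just to illustrate the split.
enum StreamType { STREAM_MUSIC, STREAM_RING };
enum Device { DEVICE_SPEAKER, DEVICE_HEADSET };

// The "policy maker": decides WHERE audio should go, but moves no data.
class Policy {
public:
    void setHeadsetPlugged(bool plugged) { mHeadset = plugged; }
    Device deviceFor(StreamType t) const {
        // Ring tones stay on the speaker even with a headset plugged in;
        // music follows the headset. (Toy rule, not the real policy table.)
        if (t == STREAM_RING) return DEVICE_SPEAKER;
        return mHeadset ? DEVICE_HEADSET : DEVICE_SPEAKER;
    }
private:
    bool mHeadset = false;
};

// The "executor": asks the policy for a device, then just writes data there.
class Flinger {
public:
    Device play(StreamType t, const Policy& policy) {
        Device d = policy.deviceFor(t);  // ask the boss, then obey
        // ... write PCM data to d here ...
        return d;
    }
};
```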
Both services also live in their own process (mediaserver), entirely separate from any application process.
How they start:
mediaserver is a module that starts along with the system.
What does the mediaserver service do?
Code: E:\liuzhibao\android\android\frameworks\av\media\mediaserver (i.e. frameworks/av/media/mediaserver in the AOSP tree)
Look at its makefile:
LOCAL_PATH:= $(call my-dir)

include $(CLEAR_VARS)

LOCAL_SRC_FILES:= \
    main_mediaserver.cpp

LOCAL_SHARED_LIBRARIES := \
    libaudioflinger \
    libcameraservice \
    libmediaplayerservice \
    libutils \
    libbinder

# FIXME The duplicate audioflinger is temporary
LOCAL_C_INCLUDES := \
    frameworks/av/media/libmediaplayerservice \
    frameworks/av/services/audioflinger \
    frameworks/av/services/camera/libcameraservice \
    frameworks/native/services/audioflinger

LOCAL_MODULE:= mediaserver

include $(BUILD_EXECUTABLE)
Quite simple; three pieces of information:
Source file: main_mediaserver.cpp
Shared libraries: libaudioflinger and libbinder are the important ones; also note the libmediaplayerservice module.
Output module: mediaserver
The only part worth reading here is the source file:
#define LOG_TAG "mediaserver"
//#define LOG_NDEBUG 0

#include <binder/IPCThreadState.h>
#include <binder/ProcessState.h>
#include <binder/IServiceManager.h>
#include <utils/Log.h>
// from LOCAL_C_INCLUDES
#include "AudioFlinger.h"
#include "CameraService.h"
#include "MediaPlayerService.h"
#include "AudioPolicyService.h"

using namespace android;

int main(int argc, char** argv)
{
    signal(SIGPIPE, SIG_IGN);
    sp<ProcessState> proc(ProcessState::self());
    sp<IServiceManager> sm = defaultServiceManager();
    ALOGI("ServiceManager: %p", sm.get());
    AudioFlinger::instantiate();
    MediaPlayerService::instantiate();
    CameraService::instantiate();
    AudioPolicyService::instantiate();
    ProcessState::self()->startThreadPool();
    IPCThreadState::self()->joinThreadPool();
}
Anyone who sees main() here should get a little excited: having main() means this builds into an executable, which is exactly why mediaserver can start with the system. Think of it as a standalone program (which it literally is) launched at boot.
As for sp<***> above: my understanding is that it stands for strong pointer (some read it as smart pointer; either reading works). It is Android's reference-counted smart pointer.
If it really confuses you, just read
sp<IServiceManager> sm = defaultServiceManager();
=>
IServiceManager* sm = defaultServiceManager();
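The reference-counting behavior behind sp<> can be shown with a toy version. This is a deliberately minimal sketch, not Android's actual RefBase/sp implementation (which also has weak pointers, atomics, and more); the names Sp and Obj are invented. The point is only that the object deletes itself when the last holder lets go:

```cpp
#include <cassert>

// Counts how many Obj instances are currently alive, so we can observe
// the automatic deletion.
static int gAlive = 0;

struct Obj {
    int refs = 0;          // reference count, managed by Sp below
    Obj() { ++gAlive; }
    ~Obj() { --gAlive; }
};

// Toy strong pointer: each copy bumps the count, each destruction drops
// it, and the last one to go deletes the object.
template <typename T>
class Sp {
public:
    explicit Sp(T* p = nullptr) : mPtr(p) { if (mPtr) ++mPtr->refs; }
    Sp(const Sp& o) : mPtr(o.mPtr) { if (mPtr) ++mPtr->refs; }
    ~Sp() { if (mPtr && --mPtr->refs == 0) delete mPtr; }
    T* get() const { return mPtr; }
private:
    Sp& operator=(const Sp&);  // assignment omitted to keep the sketch short
    T* mPtr;
};
```

So reading `sp<IServiceManager> sm = ...` as a plain pointer is fine for following the control flow; the difference only matters for object lifetime.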
Once the system has started mediaserver, it executes:
AudioFlinger::instantiate();
MediaPlayerService::instantiate();
CameraService::instantiate();
AudioPolicyService::instantiate();
My understanding of these lines: they start the four services and leave them resident, in a ready, serving state. Whenever an application plays or records audio, the interfaces it calls hook into these modules, and from then on those modules do all the work. The application only needs to call the interfaces: hand over data to play, or fetch recorded data, without caring how the playback or recording actually happens.
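The instantiate() pattern essentially means "construct the service once and publish it under a well-known name with the service manager", so clients can later look it up by name over Binder. A toy registry capturing that pattern (Registry and ToyAudioFlinger are invented names; "media.audio_flinger" is the real Binder service name AudioFlinger registers under):

```cpp
#include <cassert>
#include <map>
#include <string>

// Stand-in for the Binder service manager: a process-wide name -> service map.
class Registry {
public:
    static Registry& self() { static Registry r; return r; }
    void publish(const std::string& name, void* service) { mServices[name] = service; }
    void* lookup(const std::string& name) const {
        auto it = mServices.find(name);
        return it == mServices.end() ? nullptr : it->second;
    }
private:
    std::map<std::string, void*> mServices;
};

struct ToyAudioFlinger {
    // Mirrors the shape of AudioFlinger::instantiate(): construct a single
    // instance, then publish it so clients can find it by name.
    static ToyAudioFlinger* instantiate() {
        static ToyAudioFlinger sInstance;
        Registry::self().publish("media.audio_flinger", &sInstance);
        return &sInstance;
    }
};
```

After the four instantiate() calls, mediaserver just parks its threads in the Binder thread pool (startThreadPool/joinThreadPool) and waits for incoming requests.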
<3> The AudioFlinger module instantiated above.
Below is that module's makefile. When reading source, I recommend first identifying the module, then checking its makefile, and only then reading the source files; otherwise there are too many files to keep straight.
include $(CLEAR_VARS)

LOCAL_SRC_FILES:= \
    AudioFlinger.cpp \
    AudioMixer.cpp.arm \
    AudioResampler.cpp.arm \
    AudioPolicyService.cpp \
    ServiceUtilities.cpp \
    AudioResamplerCubic.cpp.arm \
    AudioResamplerSinc.cpp.arm

LOCAL_SRC_FILES += StateQueue.cpp

# uncomment for debugging timing problems related to StateQueue::push()
#LOCAL_CFLAGS += -DSTATE_QUEUE_DUMP

LOCAL_C_INCLUDES := \
    $(call include-path-for, audio-effects) \
    $(call include-path-for, audio-utils)

# FIXME keep libmedia_native but remove libmedia after split
LOCAL_SHARED_LIBRARIES := \
    libaudioutils \
    libcommon_time_client \
    libcutils \
    libutils \
    libbinder \
    libmedia \
    libmedia_native \
    libnbaio \
    libhardware \
    libhardware_legacy \
    libeffects \
    libdl \
    libpowermanager

LOCAL_STATIC_LIBRARIES := \
    libscheduling_policy \
    libcpustats \
    libmedia_helper

LOCAL_MODULE:= libaudioflinger
The most important source file here is AudioFlinger.cpp. Its constructor:
AudioFlinger::AudioFlinger()
    : BnAudioFlinger(),
      mPrimaryHardwareDev(NULL),
      mHardwareStatus(AUDIO_HW_IDLE),
      mMasterVolume(1.0f),
      mMasterMute(false),
      mNextUniqueId(1),
      mMode(AUDIO_MODE_INVALID),
      mBtNrecIsOff(false)
{
}
The constructor body is empty; it only initializes a few member fields.
Now look at how it creates a playback object:
sp<IAudioTrack> AudioFlinger::createTrack(
        pid_t pid,
        audio_stream_type_t streamType,
        uint32_t sampleRate,
        audio_format_t format,
        audio_channel_mask_t channelMask,
        int frameCount,
        IAudioFlinger::track_flags_t flags,
        const sp<IMemory>& sharedBuffer,
        audio_io_handle_t output,
        pid_t tid,
        int *sessionId,
        status_t *status)
Inside the function, it first checks whether a playback thread already exists for the given output:
PlaybackThread *thread = checkPlaybackThread_l(output);
and then creates the track on that thread:
track = thread->createTrack_l(client, streamType, sampleRate, format,
        channelMask, frameCount, sharedBuffer, lSessionId, flags, tid, &lStatus);
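The lookup-then-attach pattern here can be reduced to a toy: one playback thread per output handle, with tracks attached to whichever thread owns the output. The names ToyThread/ToyFlinger are invented, and the real checkPlaybackThread_l() returns an error for an unknown output handle rather than creating one on the fly, as this sketch does for brevity:

```cpp
#include <cassert>
#include <map>

// Stand-in for a PlaybackThread: it just counts the tracks attached to it.
struct ToyThread {
    int tracks = 0;
    int createTrack() { return ++tracks; }  // returns the track count
};

class ToyFlinger {
public:
    // Analog of checkPlaybackThread_l(): find the thread owning this output.
    // (Here the entry is created on first use; the real code would fail.)
    ToyThread* threadFor(int output) { return &mThreads[output]; }

    // Analog of createTrack(): look up the thread, then attach a track to it.
    int createTrack(int output) { return threadFor(output)->createTrack(); }

private:
    std::map<int, ToyThread> mThreads;  // output handle -> playback thread
};
```

The key point is that tracks for the same output share one playback thread; a second AudioTrack on the same output does not spawn a second thread.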
Now look at the AudioFlinger.h header.
Playback is handed off to a dedicated thread, and recording likewise:
// --- PlaybackThread ---
class PlaybackThread : public ThreadBase {
And its inheritance chain:
class ThreadBase : public Thread {
Recording:
// record thread
class RecordThread : public ThreadBase,
                     public AudioBufferProvider
                     // derives from AudioBufferProvider interface for use by resampler
{
When more than one stream plays at the same time, the streams have to be mixed:
class MixerThread : public PlaybackThread {
public:
    MixerThread(const sp<AudioFlinger>& audioFlinger,
                AudioStreamOut* output,
                audio_io_handle_t id,
                audio_devices_t device,
                type_t type = MIXER);
The mixer supports mixing up to 32 tracks.
As you would expect, with multiple outputs and inputs there will be a corresponding playback or record thread for each one, and all of these created threads have to be managed. That is where the following variants come in. For a single stream that needs no mixing:
class DirectOutputThread : public PlaybackThread
And when the same mixed output has to be duplicated to several outputs at once:
class DuplicatingThread : public MixerThread
Putting the whole creation flow together: when an app news an AudioTrack, a corresponding track is created on a playback thread; with many of them active, the thread classes above are what manage them.
Event handling:
void AudioFlinger::ThreadBase::sendIoConfigEvent(int event, int param)
{
    Mutex::Autolock _l(mLock);
    sendIoConfigEvent_l(event, param);
}

// sendIoConfigEvent_l() must be called with ThreadBase::mLock held
void AudioFlinger::ThreadBase::sendIoConfigEvent_l(int event, int param)
{
    IoConfigEvent *ioEvent = new IoConfigEvent(event, param);
    mConfigEvents.add(static_cast<ConfigEvent *>(ioEvent));
    ALOGV("sendIoConfigEvent() num events %d event %d, param %d",
            mConfigEvents.size(), event, param);
    mWaitWorkCV.signal();
}

// sendPrioConfigEvent_l() must be called with ThreadBase::mLock held
void AudioFlinger::ThreadBase::sendPrioConfigEvent_l(pid_t pid, pid_t tid, int32_t prio)
{
    PrioConfigEvent *prioEvent = new PrioConfigEvent(pid, tid, prio);
    mConfigEvents.add(static_cast<ConfigEvent *>(prioEvent));
    ALOGV("sendPrioConfigEvent_l() num events %d pid %d, tid %d prio %d",
            mConfigEvents.size(), pid, tid, prio);
    mWaitWorkCV.signal();
}
Configuration changes are delivered by calling the send*ConfigEvent* functions, and the playback thread has a dedicated function for processing this kind of event:
void AudioFlinger::ThreadBase::processConfigEvents()
{
    mLock.lock();
    while (!mConfigEvents.isEmpty()) {
        ALOGV("processConfigEvents() remaining events %d", mConfigEvents.size());
        ConfigEvent *event = mConfigEvents[0];
        mConfigEvents.removeAt(0);
        // release mLock before locking AudioFlinger mLock: lock order is always
        // AudioFlinger then ThreadBase to avoid cross deadlock
        mLock.unlock();
        switch(event->type()) {
            case CFG_EVENT_PRIO: {
                PrioConfigEvent *prioEvent = static_cast<PrioConfigEvent *>(event);
                int err = requestPriority(prioEvent->pid(), prioEvent->tid(), prioEvent->prio());
                if (err != 0) {
                    ALOGW("Policy SCHED_FIFO priority %d is unavailable for pid %d tid %d; error %d",
                          prioEvent->prio(), prioEvent->pid(), prioEvent->tid(), err);
                }
            } break;
            case CFG_EVENT_IO: {
                IoConfigEvent *ioEvent = static_cast<IoConfigEvent *>(event);
                mAudioFlinger->mLock.lock();
                audioConfigChanged_l(ioEvent->event(), ioEvent->param());
                mAudioFlinger->mLock.unlock();
            } break;
            default:
                ALOGE("processConfigEvents() unknown event type %d", event->type());
                break;
        }
        delete event;
        mLock.lock();
    }
    mLock.unlock();
}
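The send/process pair above is a classic mutex-plus-condition-variable work queue: senders append an event under the lock and signal; the thread wakes up and drains the queue, dropping the lock while handling each event to respect the lock ordering. A standard-library sketch of the same shape (EventQueue and its members are invented names, not AudioFlinger API):

```cpp
#include <cassert>
#include <condition_variable>
#include <deque>
#include <mutex>

struct ConfigEvent { int type; int param; };

class EventQueue {
public:
    void send(int type, int param) {               // like sendIoConfigEvent()
        std::lock_guard<std::mutex> l(mLock);
        mEvents.push_back({type, param});
        mCv.notify_one();                          // like mWaitWorkCV.signal()
    }

    // Like processConfigEvents(): drain everything queued so far.
    // Returns how many events were handled.
    int process() {
        std::unique_lock<std::mutex> l(mLock);
        int handled = 0;
        while (!mEvents.empty()) {
            ConfigEvent e = mEvents.front();
            mEvents.pop_front();
            l.unlock();          // drop our lock while "handling" the event,
            (void)e;             // as the real code does to avoid deadlock
            ++handled;
            l.lock();
        }
        return handled;
    }

private:
    std::mutex mLock;
    std::condition_variable mCv;
    std::deque<ConfigEvent> mEvents;
};
```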
Who originates these events? At the root, it is AudioPolicyService.
AudioPolicyService is covered in the next part!