First of all, the test platform is the Rockchip RK3168 running Android 4.2.2. The Android version matters: some parts of the code differ between Android 4.0 and Android 4.2.2 and are not interchangeable!
When I first got the task I had no idea where to start, because as far as I knew Android never plays a video during boot, least of all after the Linux penguin and before the boot animation; the boot animation itself is just a sequence of images, not a video file such as mp4 or 3gp!
However, the boot animation can already play sound, and in the application layer MediaPlayer can play sound and, combined with a Surface, play video as well. That is the entry point.
The key files for the boot animation live under: Z:\Backup\rk3168_v4.2\frameworks\base\cmds\bootanimation
1. Understanding how Android displays the boot screens
1. The Linux kernel boots and shows the Linux penguin splash (reboot) (Android 1.5 and later no longer load this image);
2. The Android platform starts initializing and shows the static "A N D R O I D" text;
3. The Android graphics system starts and shows the animated image with the flashing ANDROID logo (start).
4. Reference material on the mechanism:
http://blog.csdn.net/luoshengyang/article/details/7691321
http://blog.csdn.net/conowen/article/details/7884009
http://www.cnblogs.com/jqyp/archive/2012/03/07/2383973.html
http://blog.csdn.net/backgarden_straw/article/details/8571992
http://www.eoeandroid.com/thread-114742-1-1.html
1. Playing music:
The source code provided by Rockchip can actually already play music; it simply does not ship a music file! Porting the sound:
a. First, add the method declaration and the header includes in BootAnimation.h:
#include <media/AudioSystem.h>
#include <media/mediaplayer.h>
b. In class BootAnimation : public Thread, public IBinder::DeathRecipient,
add the method declaration: void playMusic();
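Put together, the BootAnimation.h changes look roughly like this (a minimal sketch showing only the added lines; the rest of the class stays as it is in the original header):

#include <media/AudioSystem.h>
#include <media/mediaplayer.h>

class BootAnimation : public Thread, public IBinder::DeathRecipient
{
    ...
private:
    void playMusic();   // new: plays the boot sound, implemented in BootAnimation.cpp
    ...
};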
c. Then implement the method in BootAnimation.cpp:
#define BOOTMUSIC_FILE "/system/media/audio/alarms/gx.mp4"

void BootAnimation::playMusic() {
    sp<MediaPlayer> mp = new MediaPlayer();
    if ((0 == access(BOOTMUSIC_FILE, F_OK)) && mp != NULL) {
        mp->setDataSource(BOOTMUSIC_FILE, NULL); // set the data source
        mp->prepare();                           // prepare (synchronous)
        mp->start();                             // start playback
    }
    // MediaPlayer offers many more methods; see the MediaPlayer class
}
d. Call it to start the sound:
bool BootAnimation::threadLoop() {
    bool r;
    playMusic();
    if (mAndroidAnimation) {
        r = android();
    } else {
        r = movie();
    }
    .......
}
e. Modifying Android.mk
Playing sound requires linking an extra library:
LOCAL_SHARED_LIBRARIES := \
    libcutils \
    libandroidfw \
    libutils \
    libbinder \
    libui \
    libskia \
    libEGL \
    libGLESv1_CM \
    libmedia \
    libgui
Note that libmedia is the newly added library.
f. Adding the sound file
In the source tree, the sounds live under:
Z:\source\rk3168_v4.2\frameworks\base\data\sounds
and the videos under:
Z:\source\rk3168_v4.2\frameworks\base\data\videos
Where the files end up after the build is configured in the AllAudio.mk file:
$(LOCAL_PATH)/XXXX.mp3:system/etc/xxxx.mp3 \
g. Put the audio file in that directory, rebuild, and you have a boot sound!
2. Porting the video
a. After reading the references above on how the animation is started, and knowing how the application layer plays video, this part is actually quite simple.
The animation is also displayed on a Surface, with the frames drawn onto it via OpenGL.
Compare this with how an application plays video (MediaPlayer + SurfaceView):
So once sound playback works, it is natural to conclude that the only missing piece is player.setDisplay(surfaceHolder):

player = new MediaPlayer();
player.setAudioStreamType(AudioManager.STREAM_MUSIC);
player.setDisplay(surfaceHolder);
player.setDataSource("/sdcard/gx.mp4");
player.prepare();
player.start();
.......
Given that, let's work backwards and see what this method does.
setDisplay is a MediaPlayer method:
base\media\java\android\media\mediaplayer.Java--->
public void setDisplay(SurfaceHolder sh) {
    mSurfaceHolder = sh;
    Surface surface;
    if (sh != null) {
        surface = sh.getSurface();
    } else {
        surface = null;
    }
    _setVideoSurface(surface); // key point: hand a Surface down to the native layer
    updateSurfaceScreenOn();
}
private native void _setVideoSurface(Surface surface); // native method

base\media\jni\android_media_MediaPlayer.cpp ----->

{"_setVideoSurface", "(Landroid/view/Surface;)V", (void *)android_media_MediaPlayer_setVideoSurface},

---->

static void android_media_MediaPlayer_setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface) {
    setVideoSurface(env, thiz, jsurface, true /* mediaPlayerMustBeAlive */);
}

------------>

static void setVideoSurface(JNIEnv *env, jobject thiz, jobject jsurface, jboolean mediaPlayerMustBeAlive) {
    sp<MediaPlayer> mp = getMediaPlayer(env, thiz);
    if (mp == NULL) {
        if (mediaPlayerMustBeAlive) {
            jniThrowException(env, "java/lang/IllegalStateException", NULL);
        }
        return;
    }
    decVideoSurfaceRef(env, thiz);
    sp<ISurfaceTexture> new_st;
    if (jsurface) {
        sp<Surface> surface(android_view_Surface_getSurface(env, jsurface));
        if (surface != NULL) {
            new_st = surface->getSurfaceTexture();
            if (new_st == NULL) {
                jniThrowException(env, "java/lang/IllegalArgumentException",
                        "The surface does not have a binding SurfaceTexture!");
                return;
            }
            new_st->incStrong(thiz);
        } else {
            jniThrowException(env, "java/lang/IllegalArgumentException",
                    "The surface has been released");
            return;
        }
    }
    env->SetIntField(thiz, fields.surface_texture, (int)new_st.get());

    // This will fail if the media player has not been initialized yet. This
    // can be the case if setDisplay() on MediaPlayer.java has been called
    // before setDataSource(). The redundant call to setVideoSurfaceTexture()
    // in prepare/prepareAsync covers for this case.
    mp->setVideoSurfaceTexture(new_st); // key point
}
By this point we know exactly what is needed: we just have to obtain a Surface!

b. Analyzing how the animation builds its surface:

In base\cmds\bootanimation\BootAnimation.cpp, status_t BootAnimation::readyToRun() initializes the surface. Once you understand that, most of the work is done: obtain such a surface and pass it through the method above, and you are there.
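For reference, the surface setup in readyToRun() looks roughly like this in the 4.2 sources (a trimmed sketch; the EGL/OpenGL initialization that follows is omitted):

status_t BootAnimation::readyToRun() {
    mAssets.addDefaultAssets();

    sp<IBinder> dtoken(SurfaceComposerClient::getBuiltInDisplay(
            ISurfaceComposer::eDisplayIdMain));
    DisplayInfo dinfo;
    status_t status = SurfaceComposerClient::getDisplayInfo(dtoken, &dinfo);
    if (status)
        return -1;

    // create the native surface
    sp<SurfaceControl> control = session()->createSurface(String8("BootAnimation"),
            dinfo.w, dinfo.h, PIXEL_FORMAT_RGB_565);

    SurfaceComposerClient::openGlobalTransaction();
    control->setLayer(0x40000000);
    SurfaceComposerClient::closeGlobalTransaction();

    sp<Surface> s = control->getSurface();

    // ... EGL/GL initialization using this surface follows; the playMusic() below
    // builds its own surface the same way and hands it to MediaPlayer instead.
}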
c. The main function:
#define ANIMATION_BF_SIZE 1024*1024*3
#define TEM_VIDEO_PATH "/data/local/open.mp4"

void BootAnimation::playMusic() {
    int length = 0;
    int fd;
    int ret = -1;
    sp<MediaPlayer> mp = new MediaPlayer();
    if (mp != NULL) {
        char *bf = (char *)malloc(sizeof(char) * ANIMATION_BF_SIZE);
        if (bf == NULL) {
            return;
        }
        fd = open(TEM_VIDEO_PATH, O_RDWR | O_CREAT, 0777);
        if (fd < 0) {
            ALOGD("hcm ++++++++++open++++++++++++failed++error = %d\n", errno);
            return;
        }
        ret = write(fd, bf, ANIMATION_BF_SIZE);
        if (ret < 0) {
            ALOGD("hcm ++++++++++write++++++++++++failed++error = %d\n", errno);
            return;
        }
        mp->setDataSource(fd, 0, ANIMATION_BF_SIZE);
        ALOGD("hcm ++++++++++playMusic++++++++++++\n");
#if 0
        // set up the thread-pool
        sp<ProcessState> proc(ProcessState::self());
        ProcessState::self()->startThreadPool();

        // create a client to surfaceflinger
        sp<SurfaceComposerClient> client = new SurfaceComposerClient();
        sp<SurfaceControl> surfaceControl = client->createSurface(
                getpid(), 0, 160, 240, PIXEL_FORMAT_RGB_565);
        SurfaceComposerClient::openGlobalTransaction();
        surfaceControl->setLayer(100000);
        SurfaceComposerClient::closeGlobalTransaction();

        // pretend it went cross-process
        Parcel parcel;
        SurfaceControl::writeSurfaceToParcel(surfaceControl, &parcel);
        parcel.setDataPosition(0);
        sp<Surface> surface = Surface::readFromParcel(parcel);
        //surface.get
        IPCThreadState::self()->joinThreadPool();
#else
        //mAssets.addDefaultAssets();
        sp<IBinder> dtoken(SurfaceComposerClient::getBuiltInDisplay(
                ISurfaceComposer::eDisplayIdMain));
        DisplayInfo dinfo;
        status_t status = SurfaceComposerClient::getDisplayInfo(dtoken, &dinfo);
        Rect layerStackRect(dinfo.h, dinfo.w);
        Rect displayRect(dinfo.h, dinfo.w);
        //SurfaceComposerClient::setOrientation(dtoken, 1, 0);
        SurfaceComposerClient::setDisplayProjection(dtoken, 1, layerStackRect, displayRect);
        SurfaceComposerClient::getDisplayInfo(dtoken, &dinfo);
        int curWidth = dinfo.w;
        int curHeight = dinfo.h;
        ALOGD("hcm +11+curWidth = %d+curHeight = %d\n", curWidth, curHeight);
        ALOGD("hcm +11+curWidth = %d+curHeight = %d\n", curWidth, curHeight);
        if (dinfo.orientation % 2 == 0) {
            curWidth = dinfo.w;
            curHeight = dinfo.h;
        } else {
            curWidth = dinfo.h;
            curHeight = dinfo.w;
        }
        ALOGD("hcm +22+curWidth = %d+curHeight = %d+dinfo.orientation = %d\n", curWidth, curHeight, dinfo.orientation);
        ALOGD("hcm +22+curWidth = %d+curHeight = %d\n", curWidth, curHeight);
        curWidth = 1024;
        curHeight = 600;

        // create the native surface for the video
        sp<SurfaceControl> videoControl = session()->createSurface(String8("BootVideo"),
                curWidth, curHeight, PIXEL_FORMAT_RGB_565);
        SurfaceComposerClient::openGlobalTransaction();
        videoControl->setLayer(0x40000000);
        SurfaceComposerClient::closeGlobalTransaction();

        sp<Surface> sf = videoControl->getSurface();
        sp<ISurfaceTexture> new_st;
        new_st = sf->getSurfaceTexture();
        mp->setVideoSurfaceTexture(new_st);
#endif
        mp->prepare();
        mp->setVolume(1.0f, 1.0f);
        mp->start();
        int msec = 0;
        mp->getDuration(&msec);
        ALOGD("mp->getDuration---hcm---msec = %d\n", msec);
        usleep((msec) * 1000 - 128);
        mp->stop();
        sf.clear();
        videoControl.clear();
        close(fd);
        free(bf);
        ret = remove(TEM_VIDEO_PATH);
        if (ret < 0) {
            ALOGD("hcm ++++++++++remove++++++++++++failed++error = %d\n", errno);
        }

        Rect cLayerStackRect(600, 1024);
        Rect cDisplayRect(600, 1024);
        SurfaceComposerClient::setDisplayProjection(dtoken, 0, cLayerStackRect, cDisplayRect);

        // create the native surface for the boot animation
        sp<SurfaceControl> control = session()->createSurface(String8("BootAnimation"),
                600, 1024, PIXEL_FORMAT_RGB_565);
        SurfaceComposerClient::openGlobalTransaction();
        control->setLayer(0x40000000);
        SurfaceComposerClient::closeGlobalTransaction();
    }
}

d. Problems encountered during the porting:
1. The video orientation did not match the surface orientation; the simple fix is to change the orientation of the video itself.
2. Mismatched video/audio encoding formats caused stuttering; re-encode the video with supported audio and video codecs.
3. Make sure the video has finished playing and the SurfaceTexture is released, so watch the end time carefully, or tell the player to stop (mp->stop()) before the boot animation exits.
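For the third point, a minimal sketch of one possible approach (it assumes playMusic() is changed to start playback asynchronously and to keep the player in a hypothetical member mBootPlayer rather than a local variable), so the player is stopped explicitly before the animation thread exits instead of relying only on sleeping for the clip duration:

bool BootAnimation::threadLoop() {
    bool r;
    playMusic();                       // starts the boot video on mBootPlayer (hypothetical member)
    if (mAndroidAnimation) {
        r = android();
    } else {
        r = movie();
    }
    // before the boot animation tears down, make sure the player has stopped
    // and released the decoder and its SurfaceTexture
    if (mBootPlayer != NULL && mBootPlayer->isPlaying()) {
        mBootPlayer->stop();
    }
    .......
    return r;
}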