Originally published on 简书 (Jianshu); these are notes taken while working on a project.
How video playback works:
The system first determines the video's container format and obtains the encoded stream, then decodes it into individual frames, and finally redraws those frames on a canvas in rapid succession. That work clearly has to run on its own thread rather than the UI thread, which is where SurfaceView comes in.
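As a rough illustration of that idea (decode on a worker thread, then blit each frame onto a surface), here is a minimal hedged sketch; decodeNextFrame() is a hypothetical placeholder for whatever actually produces decoded frames (in practice MediaCodec or a native decoder) and is not part of this project:
import android.graphics.Bitmap;
import android.graphics.Canvas;
import android.view.SurfaceHolder;

// Worker thread that pulls decoded frames and draws them onto a SurfaceView's surface.
class FrameRenderer extends Thread {
    private final SurfaceHolder holder;
    private volatile boolean running = true;

    FrameRenderer(SurfaceHolder holder) {
        this.holder = holder;
    }

    @Override
    public void run() {
        while (running) {
            Bitmap frame = decodeNextFrame(); // hypothetical: next decoded video frame
            if (frame == null) break;         // end of stream
            Canvas canvas = holder.lockCanvas();
            if (canvas == null) continue;     // surface not ready or already destroyed
            try {
                canvas.drawBitmap(frame, 0, 0, null);
            } finally {
                holder.unlockCanvasAndPost(canvas); // push this frame to the screen
            }
        }
    }

    void quit() {
        running = false;
    }

    // Placeholder so the sketch compiles; a real player would decode here.
    private Bitmap decodeNextFrame() {
        return null;
    }
}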
VideoView vv = (VideoView) findViewById(R.id.vv);
//Play a file from res/raw; the file name must be in lowercase letters; formats: flv, mp4, ...
//Uri.parse("file:///sdcard/sdCardFile.3gp")
//vv.setVideoURI(Uri.parse("http://..."));
vv.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.shuai_dan_ge));
vv.start();
vv.requestFocus();
/* Other methods:
   vv.pause(); vv.stop(); vv.resume();
   vv.setOnPreparedListener(this); vv.setOnErrorListener(this); vv.setOnCompletionListener(this);
*/
Error handling: an unsupported video encoding comes up quite often, so it is worth handling here; return true from onError() if you do not want the system error dialog to pop up.
http://developer.android.com/intl/zh-cn/reference/android/media/MediaPlayer.OnErrorListener.html
@Override
public boolean onError(MediaPlayer mp, int what, int extra) {
    if (what == MediaPlayer.MEDIA_ERROR_SERVER_DIED) {
        Log.v(TAG, "Media Error, Server Died " + extra);
    } else if (what == MediaPlayer.MEDIA_ERROR_UNKNOWN) {
        Log.v(TAG, "Media Error, Error Unknown " + extra);
    }
    return false;
}
QCMediaPlayer.java
//Common error: "Cannot play this video" - fails on the Redmi 1S (Telecom edition, 4.4.4) but plays fine on the Samsung S6 (5.1.1)
//Playback source: http://27.152.191.198/c12.e.99.com/b/p/67/c4ff9f6535ac41a598bb05bf5b05b185/c4ff9f6535ac41a598bb05bf5b05b185.v.854.480.f4v
MediaPlayer-JNI: QCMediaPlayer mediaplayer NOT present
MediaPlayer: Unable to create media player
MediaPlayer: Couldn't open file on client side, trying server side
MediaPlayer: error (1, -2147483648)
MediaPlayer: Error (1,-2147483648)
Some people say the exception can be handled with the approach below, but since I am using the system's ready-made control I cannot really hook into this, can I? Noting it down for now:
MediaPlayer player = MediaPlayer.create(this, Uri.parse(sound_file_path));
MediaPlayer player = MediaPlayer.create(this, soundRedId, loop);
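For reference, a bare MediaPlayer (which is what MediaPlayer.create() wraps) rendered onto your own SurfaceView would look roughly like the hedged sketch below; it at least gives direct access to the error callback. surfaceView is an assumed SurfaceView from the layout, and sound_file_path is the same path variable as above:
MediaPlayer player = new MediaPlayer();
player.setOnErrorListener((mp, what, extra) -> {
    Log.v(TAG, "MediaPlayer error what=" + what + " extra=" + extra);
    return true; // handled: suppress the system error dialog
});
try {
    player.setDataSource(this, Uri.parse(sound_file_path));
    player.setDisplay(surfaceView.getHolder()); // render the video onto our own SurfaceView
    player.setOnPreparedListener(MediaPlayer::start);
    player.prepareAsync();
} catch (IOException e) {
    Log.v(TAG, "setDataSource failed", e);
}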
The log from the emulator (Nexus 6, Android 5.1) is a bit more complete; recorded here:
OkHttp: [{"type":1,"quality":1,"audio_index":0,"urls":["http://v11.huayu.nd/b/p/3038/5adfdf0893e64af18c19ddfd4b44e809/5adfdf0893e64af18c19ddfd4b44e809.v.640.360.f4v"]},{"type":2,"quality":1,"audio_index":0,"urls":["http://v11.huayu.nd/b/p/3038/5adfdf0893e64af18c19ddfd4b44e809/5adfdf0893e64af18c19ddfd4b44e809.v.640.360.mp4"]},{"type":1,"quality":2,"audio_index":0,"urls":["http://v11.huayu.nd/b/p/3038/5adfdf0893e64af18c19ddfd4b44e809/5adfdf0893e64af18c19ddfd4b44e809.v.854.480.f4v"]},{"type":2,"quality":2,"audio_index":0,"urls":["http://v11.huayu.nd/b/p/3038/5adfdf0893e64af18c19ddfd4b44e809/5adfdf0893e64af18c19ddfd4b44e809.v.854.480.mp4"]},{"type":1,"quality":3,"audio_index":0,"urls":["http://v11.huayu.nd/b/p/3038/5adfdf0893e64af18c19ddfd4b44e809/5adfdf0893e64af18c19ddfd4b44e809.v.1280.720.f4v"]},{"type":2,"quality":3,"audio_index":0,"urls":["http://v11.huayu.nd/b/p/3038/5adfdf0893e64af18c19ddfd4b44e809/5adfdf0893e64af18c19ddfd4b44e809.v.1280.720.mp4"]}]
OkHttp: <-- END HTTP (957-byte body)
// I am playing the first URL
AudioTrack: AUDIO_OUTPUT_FLAG_FAST denied by client
WindowManager: Adding window Window{7421602 u0 SurfaceView} at 4 of 10 (before Window{756e39b u0 com.nd.app.factory.appnew99u557557/com.nd.sdp.component.slp.student.VideoPlayActivity})
MediaFocusControl: AudioFocus requestAudioFocus() from android.media.AudioManager@223203d2 req=1flags=0x0
03-30 20:42:25.394 1920-1920/com.nd.app.factory.appnew99u557557 D/MediaPlayer: Couldn't open file on client side, trying server side
ALooperRoster: failed to post message. Target handler 0 still registered, but object gone.
NuCachedSource2: caching reached eos.
FFmpegExtractor: SniffFFMPEG
FFmpegExtractor: android-source:0xf5987160
FFMPEG: android source begin open
FFMPEG: android open, url: android-source:0xf5987160
FFMPEG: ffmpeg open android data source success, source ptr: 0xf5987160
FFMPEG: android source open success
FFMPEG: Input #0, flv, from 'android-source:0xf5987160':
FFMPEG: Metadata:
FFMPEG: metadatacreator : Yet Another Metadata Injector for FLV - Version 1.9
FFMPEG: hasKeyframes : true
FFMPEG: hasVideo : true
FFMPEG: hasAudio : true
FFMPEG: hasMetadata : true
FFMPEG: canSeekToEnd : true
FFMPEG: datasize : 77960
FFMPEG: videosize : 31295
FFMPEG: audiosize : 44797
FFMPEG: lasttimestamp : 7
FFMPEG: lastkeyframetimestamp: 7
FFMPEG: lastkeyframelocation: 78625
FFMPEG: Duration: 00:00:06.76, start: 0.004000, bitrate: 93 kb/s
FFMPEG: Stream #0:0: Video: h264 (High), yuv420p, 640x360 [SAR 1:1 DAR 16:9], 34 kb/s, 25 fps, 25 tbr, 1k tbn, 50 tbc
FFMPEG: Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp, 48 kb/s
FFmpegExtractor: FFmpegExtrator, url: android-source:0xf5987160, format_name: flv, format_long_name: FLV (Flash Video)
FFmpegExtractor: list the formats suppoted by ffmpeg:
FFmpegExtractor: ========================================
FFmpegExtractor: format_names[00]: mpeg
FFmpegExtractor: format_names[01]: mpegts
FFmpegExtractor: format_names[02]: mov,mp4,m4a,3gp,3g2,mj2
FFmpegExtractor: format_names[03]: matroska,webm
FFmpegExtractor: format_names[04]: asf
FFmpegExtractor: format_names[05]: rm
FFmpegExtractor: format_names[06]: flv
FFmpegExtractor: format_names[07]: swf
FFmpegExtractor: format_names[08]: avi
FFmpegExtractor: format_names[09]: ape
FFmpegExtractor: format_names[10]: dts
FFmpegExtractor: format_names[11]: flac
FFmpegExtractor: format_names[12]: ac3
FFmpegExtractor: format_names[13]: wav
FFmpegExtractor: format_names[14]: ogg
FFmpegExtractor: format_names[15]: vc1
FFmpegExtractor: format_names[16]: hevc
FFmpegExtractor: ========================================
FFmpegExtractor: suppoted codec(h264) by official Stagefright
FFmpegExtractor: suppoted codec(aac) by official Stagefright
FFMPEG: android source close
FFmpegExtractor: sniff through BetterSniffFFMPEG success
FFmpegExtractor: ffmpeg detected media content as 'video/x-flv' with confidence 0.08
GenericSource: Failed to init from data source!
MediaPlayer: error (1, -2147483648)
MediaPlayer: Error (1,-2147483648)
AndroidManifest.xml still declares the activity as portrait, and a btnSwitch button is added to toggle between landscape and portrait:
<activity android:name="lynxz.org.video.VideoActivity" android:configChanges="keyboard|orientation|screenSize" android:screenOrientation="portrait" android:theme="@style/Theme.AppCompat.Light.NoActionBar"/>
Wrap the VideoView in a container, for example:
<RelativeLayout android:id="@+id/rl_vv" android:layout_width="match_parent" android:layout_height="wrap_content" android:background="@android:color/black">
<VideoView android:id="@+id/vv" android:layout_width="match_parent" android:layout_height="wrap_content" android:layout_centerInParent="true" android:minHeight="200dp"/>
</RelativeLayout>
This is done so that when the screen orientation changes we can stretch rl_vv, while the inner VideoView recomputes its own width and height. Looking at its onMeasure() source, it resizes based on the aspect ratio of the video versus the view; if the view's width and height are hard-coded instead, the video gets stretched:
//VideoView.java
@Override
protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec) {
int width = getDefaultSize(mVideoWidth, widthMeasureSpec);
int height = getDefaultSize(mVideoHeight, heightMeasureSpec);
......
setMeasuredDimension(width, height);
}
btnSwitch.setOnClickListener(v -> {
if (getRequestedOrientation() == ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE) {
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
} else {
setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE);
}
});
Setting the VideoView layout size
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
if (vv == null) {
return;
}
if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE) { // landscape
getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN, WindowManager.LayoutParams.FLAG_FULLSCREEN);
getWindow().getDecorView().invalidate();
float height = DensityUtil.getWidthInPx(this);
float width = DensityUtil.getHeightInPx(this);
mRlVv.getLayoutParams().height = (int) width;
mRlVv.getLayoutParams().width = (int) height;
} else {
final WindowManager.LayoutParams attrs = getWindow().getAttributes();
attrs.flags &= (~WindowManager.LayoutParams.FLAG_FULLSCREEN);
getWindow().setAttributes(attrs);
getWindow().clearFlags(WindowManager.LayoutParams.FLAG_LAYOUT_NO_LIMITS);
float width = DensityUtil.getWidthInPx(this);
float height = DensityUtil.dip2px(this, 200.f);
mRlVv.getLayoutParams().height = (int) height;
mRlVv.getLayoutParams().width = (int) width;
}
}
Custom utility class:
//DensityUtil.java
public static final float getHeightInPx(Context context) {
    final float height = context.getResources().getDisplayMetrics().heightPixels;
    return height;
}
public static final float getWidthInPx(Context context) {
    final float width = context.getResources().getDisplayMetrics().widthPixels;
    return width;
}
// Referenced by the onConfigurationChanged() snippet above: standard dp-to-px conversion.
public static int dip2px(Context context, float dpValue) {
    final float scale = context.getResources().getDisplayMetrics().density;
    return (int) (dpValue * scale + 0.5f);
}
Reference articles
@TargetApi(Build.VERSION_CODES.ICE_CREAM_SANDWICH)
private void createVideoThumbnail() {
Observable<Bitmap> observable = Observable.create(new Observable.OnSubscribe<Bitmap>() {
@Override
public void call(Subscriber<? super Bitmap> subscriber) {
Bitmap bitmap = null;
MediaMetadataRetriever retriever = new MediaMetadataRetriever();
int kind = MediaStore.Video.Thumbnails.MINI_KIND;
if (Build.VERSION.SDK_INT >= 14) {
retriever.setDataSource(mVideoUrl, new HashMap<String, String>());
} else {
retriever.setDataSource(mVideoUrl);
}
bitmap = retriever.getFrameAtTime();
subscriber.onNext(bitmap);
retriever.release();
}
});
observable.subscribeOn(Schedulers.io()) // do the frame extraction off the main thread
.observeOn(AndroidSchedulers.mainThread()) // deliver the bitmap back on the main thread
.subscribe(new Action1<Bitmap>() {
@Override
public void call(Bitmap bitmap) {
//set as the cover image
mYourVideoPlayerContainer.setBackgroundDrawable(new BitmapDrawable(bitmap));
}
});
}
<uses-permission android:name="android.permission.WRITE_SETTINGS"/>
<uses-permission android:name="android.permission.VIBRATE"/> //按需申请
/* Adjust the current screen brightness (value is a delta applied on a 0--255 scale) and make it take effect */
private void setScreenBrightness(float value) {
WindowManager.LayoutParams lp = getWindow().getAttributes();
lp.screenBrightness = lp.screenBrightness + value / 255.0f;
Vibrator vibrator;
if (lp.screenBrightness > 1) {
lp.screenBrightness = 1;
// vibrator = (Vibrator) getSystemService(VIBRATOR_SERVICE);
// long[] pattern = {10, 200}; // OFF/ON/OFF/ON...
// vibrator.vibrate(pattern, -1);
} else if (lp.screenBrightness < 0.2) {
lp.screenBrightness = (float) 0.2;
// vibrator = (Vibrator) getSystemService(VIBRATOR_SERVICE);
// long[] pattern = {10, 200}; // OFF/ON/OFF/ON...
// vibrator.vibrate(pattern, -1);
}
getWindow().setAttributes(lp);
// Persist the chosen screen brightness value
// Settings.System.putInt(getContentResolver(), Settings.System.SCREEN_BRIGHTNESS, (int) value);
}
// value can be: Settings.System.SCREEN_BRIGHTNESS_MODE_AUTOMATIC / SCREEN_BRIGHTNESS_MODE_MANUAL
private void setScreenMode(int value) {
Settings.System.putInt(getContentResolver(), Settings.System.SCREEN_BRIGHTNESS_MODE, value);
}
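One caveat I am fairly sure of but have not exercised in this project: from Android 6.0 (API 23), WRITE_SETTINGS is a special permission the user has to grant on a dedicated settings screen, so a check like the sketch below is needed before setScreenMode() will actually take effect:
// Android 6.0+: WRITE_SETTINGS must be granted from the special settings screen.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && !Settings.System.canWrite(this)) {
    Intent intent = new Intent(Settings.ACTION_MANAGE_WRITE_SETTINGS,
            Uri.parse("package:" + getPackageName()));
    startActivity(intent);
} else {
    setScreenMode(Settings.System.SCREEN_BRIGHTNESS_MODE_MANUAL);
}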
mGestureDetector = new GestureDetector(this, mGestureListener);
vv.setOnTouchListener(this);
@Override
public boolean onTouch(View v, MotionEvent event) {
return mGestureDetector.onTouchEvent(event);
}
onDown() / onScroll() should return true:
private android.view.GestureDetector.OnGestureListener mGestureListener = new GestureDetector.OnGestureListener() {
@Override
public boolean onDown(MotionEvent e) {
return true;
}
@Override
public void onShowPress(MotionEvent e) {
}
@Override
public boolean onSingleTapUp(MotionEvent e) {
return false;
}
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
final double FLING_MIN_VELOCITY = 0.5;
final double FLING_MIN_DISTANCE = 0.5;
if (e1.getY() - e2.getY() > FLING_MIN_DISTANCE
&& Math.abs(distanceY) > FLING_MIN_VELOCITY) {
setScreenBrightness(20);
}
if (e1.getY() - e2.getY() < FLING_MIN_DISTANCE
&& Math.abs(distanceY) > FLING_MIN_VELOCITY) {
setScreenBrightness(-20);
}
return true;
}
@Override
public void onLongPress(MotionEvent e) {
}
@Override
public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
return true;
}
};
Modify the onScroll() method above to call the following:
private void setVoiceVolume(float value) {
int currentVolume = mAudioManager.getStreamVolume(AudioManager.STREAM_MUSIC);
int maxVolume = mAudioManager.getStreamMaxVolume(AudioManager.STREAM_MUSIC);
int flag = value > 0 ? -1 : 1;
currentVolume += flag * 0.1 * maxVolume;
// clamp currentVolume to the valid [0, maxVolume] range
currentVolume = Math.max(0, Math.min(currentVolume, maxVolume));
mAudioManager.setStreamVolume(AudioManager.STREAM_MUSIC, currentVolume, 0);
}
Stepping through VideoView's seekTo() with breakpoints, the time values are all accurate; it is only after the system calls into the native method that the progress bar jumps back. I looked at the clients of the major video platforms and the problem is quite common there too, so for now it cannot be fixed:
Some resources I found:
1. A workaround for the inaccurate seekTo of Android's VideoView
2. Extracting video keyframes
The first link mentions the keyframe issue. I tested it with a video: seeking to a fixed time point always jumps to the same fixed position ==! I still cannot read the native code, but I was also careless when testing back then and missed this pattern.
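One possible mitigation that I have not verified on the devices mentioned here: starting with API 26, MediaPlayer.seekTo(long, int) takes a seek mode, and SEEK_CLOSEST asks the player to land on the exact frame instead of snapping back to the previous sync (key) frame. VideoView hands out its MediaPlayer in onPrepared(), so a hedged sketch would be (targetPositionMs is an assumed variable holding the desired position):
vv.setOnPreparedListener(mp -> {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
        // Seek to the exact frame rather than the previous keyframe.
        mp.seekTo(targetPositionMs, MediaPlayer.SEEK_CLOSEST);
    } else {
        mp.seekTo((int) targetPositionMs);
    }
});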
Symptom: if the page goes through onPause() while a video is playing and is later resumed, the VideoView starts playing from the beginning again. A temporary workaround is to record the current playback position in onPause() and seek back to it in onResume(); it still shows a brief black screen, though. Code below:
int mPlayingPos = 0;
@Override
protected void onPause() {
mPlayingPos = mVideoView.getCurrentPosition(); // read the position before stopPlayback(); see the source for why
mVideoView.stopPlayback();
super.onPause();
}
@Override
protected void onResume() {
if (mPlayingPos > 0) {
// For a better user experience you can show a ProgressBar here; some clients hide the bottom control bar during this step, which also works well
mVideoView.start();
mVideoView.seekTo(mPlayingPos);
mPlayingPos = 0;
}
super.onResume();
}
I found some possibly related articles; the links are dead, but the snapshots are below (I really need to go read up on SurfaceView ~~#):
Another similar one, problem 7 of "common Android development problems", also points to SurfaceView as the cause; see the explanation further down for why the area shows up black:
Switching the app to the background:
the Activity callbacks run in the order onPause() -> onStop();
the SurfaceView calls surfaceDestroyed().
Then switching back to the app:
the Activity callbacks run in the order onRestart() -> onStart() -> onResume();
the SurfaceView calls surfaceCreated() -> surfaceChanged().
Pressing the end-call key or locking the screen:
the Activity only calls onPause();
after unlocking, the Activity calls onResume();
the SurfaceView calls no methods at all.
Why SurfaceView shows a black area: as can be seen from SurfaceView's draw() and dispatchDraw() methods, the windowType field in SurfaceView is initialized to WindowManager.LayoutParams.TYPE_APPLICATION_MEDIA, so while this View is being created and drawn the entire Canvas gets painted black.
Characteristics:
Double buffering (**! still fairly fuzzy to me)
Allocate a region of memory and pre-draw the images to be displayed into it; when it is time to show them, draw directly from that buffer. More concretely: first point a Canvas at a Bitmap (via setBitmap(), or the Canvas(Bitmap) constructor) and draw all of the graphics into it, i.e. do the work entirely in memory, then call drawBitmap() to render that Bitmap on the screen.
Here is another way of putting it:
SurfaceView's double buffering is a bit different. It has two buffers, a front buffer and a back buffer, which are flipped onto the screen alternately: what you currently see is the front buffer, and when the content changes the back buffer draws the new content on top of what it already had, after which the front and back buffers swap places. Note that because there are two buffers, redrawing everything every time causes no problems, but if each pass only draws part of the content, the screen alternates between two different partial frames, which is obviously not what we want. The fix is to clear the screen each time and then redraw everything (a sketch of this follows the implementation code below).
实现:
// 创建一个200*200的缓冲区,存放目标Bitmap
Bitmap bitmapBuffer = Bitmap.createBitmap(200, 200, Config.ARGB_8888);
//设置目标内容绘制到缓冲区
Canvas canvas = new Canvas(bitmapBuffer);
//将要绘制的内容绘制在缓冲bitmap中,比如一张图片
Bitmap pic = ((BitmapDrawable) getResources().getDrawable(R.drawable.qq)).getBitmap();
canvas.drawBitmap(pic,0,0,paint);
//最后将缓冲内容一次性输出到屏幕上
canvas.drawBitmap(bitmapBuffer,0,0,paint);
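And for the SurfaceView front/back buffer case described above, the "clear the screen, then redraw everything" advice would look roughly like this inside a drawing loop (assuming holder is the SurfaceHolder and bitmapBuffer/paint are the objects from the snippet above):
Canvas canvas = holder.lockCanvas();
if (canvas != null) {
    try {
        // Clear the whole buffer first, otherwise the two buffers may end up
        // alternating between two different partial frames.
        canvas.drawColor(Color.BLACK);
        // ...then redraw the complete scene from scratch.
        canvas.drawBitmap(bitmapBuffer, 0, 0, paint);
    } finally {
        holder.unlockCanvasAndPost(canvas);
    }
}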
surfaceView.getHolder(): obtains the holder object.
surfaceCreated(): called when the surface is created; the drawing thread is usually started here.
surfaceChanged(): called when the surface size changes, e.g. when switching between portrait and landscape.
surfaceDestroyed(): called when the surface is destroyed; drawing is usually stopped here.
(A registration sketch follows the framework source excerpt below.)
//SurfaceView.java
int mWindowType = WindowManager.LayoutParams.TYPE_APPLICATION_MEDIA;
public void draw(Canvas canvas) {
if (mWindowType != WindowManager.LayoutParams.TYPE_APPLICATION_PANEL) {
// draw() is not called when SKIP_DRAW is set
if ((mPrivateFlags & PFLAG_SKIP_DRAW) == 0) {
// punch a whole in the view-hierarchy below us
canvas.drawColor(0, PorterDuff.Mode.CLEAR);
}
}
super.draw(canvas);
}
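For reference, registering the three callbacks listed above would look roughly like this (the FrameRenderer worker is the hypothetical sketch from the playback-principle section at the top; any drawing thread would do):
surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
    private FrameRenderer renderer;

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        // The surface is ready: start the drawing thread here.
        renderer = new FrameRenderer(holder);
        renderer.start();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        // Size changed, e.g. after rotating between portrait and landscape.
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        // Window hidden / activity stopped: stop drawing before the surface goes away.
        renderer.quit();
    }
});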
The phone's "menu key" causes the app to be stopped even though it still looks visible.
From the comments in SurfaceView.java: when the menu key is pressed the page may appear visible, but onStop() has in fact already been called, and the surface is destroyed whenever its window becomes invisible:
The Surface will be created for you while the SurfaceView’s window is
visible; you should implement {@link SurfaceHolder.Callback#surfaceCreated}
and {@link SurfaceHolder.Callback#surfaceDestroyed} to discover when the
Surface is created and destroyed as the window is shown and hidden.
VideoView cannot play the f4v format (it plays on the Samsung S6 but fails on the Redmi 1S running 4.4.4)....
Once I am up to it, this article might be worth a look: