This chapter covers searching for the videos stored on the phone and adding them to a ListView, where each list item shows the video's thumbnail, duration, and title. In Searching Local Videos (1) we first build the search feature.

Video.java -- the class holding a video's properties
package com.zhangjie.graduation.videopalyer.videofile;
import java.io.Serializable;
import com.zhangjie.graduation.videopalyer.component.LoadedImage;
public class Video implements Serializable{
/**
*
*/
private static final long serialVersionUID = -7920222595800367956L;
private int id;
private String title;
private String album;
private String artist;
private String displayName;
private String mimeType;
private String path;
private long size;
private long duration;
private LoadedImage image;
/**
*
*/
public Video() {
super();
}
/**
* @param id
* @param title
* @param album
* @param artist
* @param displayName
* @param mimeType
* @param path
* @param size
* @param duration
*/
public Video(int id, String title, String album, String artist,
String displayName, String mimeType, String path, long size,
long duration) {
super();
this.id = id;
this.title = title;
this.album = album;
this.artist = artist;
this.displayName = displayName;
this.mimeType = mimeType;
this.path = path;
this.size = size;
this.duration = duration;
}
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public String getTitle() {
return title;
}
public void setTitle(String title) {
this.title = title;
}
public String getAlbum() {
return album;
}
public void setAlbum(String album) {
this.album = album;
}
public String getArtist() {
return artist;
}
public void setArtist(String artist) {
this.artist = artist;
}
public String getDisplayName() {
return displayName;
}
public void setDisplayName(String displayName) {
this.displayName = displayName;
}
public String getMimeType() {
return mimeType;
}
public void setMimeType(String mimeType) {
this.mimeType = mimeType;
}
public String getPath() {
return path;
}
public void setPath(String path) {
this.path = path;
}
public long getSize() {
return size;
}
public void setSize(long size) {
this.size = size;
}
public long getDuration() {
return duration;
}
public void setDuration(long duration) {
this.duration = duration;
}
public LoadedImage getImage(){
return image;
}
public void setImage(LoadedImage image){
this.image = image;
}
}
AbstructProvider.java --- an interface for obtaining the collection of videos found by the search
package com.zhangjie.graduation.videopalyer.videofile;

import java.util.List;

public interface AbstructProvider {
public List<Video> getList();
}

VideoProvider.java --- implements the AbstructProvider interface and searches the video information through a Cursor
package com.zhangjie.graduation.videopalyer.videofile;

import java.util.ArrayList;
import java.util.List;

import android.content.Context;
import android.database.Cursor;
import android.provider.MediaStore;

public class VideoProvider implements AbstructProvider {
    private Context context;

    public VideoProvider(Context context) {
        this.context = context;
    }

    @Override
    public List<Video> getList() {
        // ... (the body of this method did not survive in the original post;
        // a reconstructed sketch follows below)
    }
}
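The body of getList() was lost in the original post. Below is a minimal reconstruction of a Cursor-based query, assuming it reads the standard MediaStore.Video.Media columns that mirror the fields of the Video class; the exact projection and ordering used in the original code are unknown:

@Override
public List<Video> getList() {
    List<Video> list = new ArrayList<Video>();
    if (context == null) {
        return list;
    }
    // Query the system media database for every video on external storage.
    Cursor cursor = context.getContentResolver().query(
            MediaStore.Video.Media.EXTERNAL_CONTENT_URI, null, null, null, null);
    if (cursor != null) {
        while (cursor.moveToNext()) {
            int id = cursor.getInt(cursor.getColumnIndexOrThrow(MediaStore.Video.Media._ID));
            String title = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.TITLE));
            String album = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.ALBUM));
            String artist = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.ARTIST));
            String displayName = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.DISPLAY_NAME));
            String mimeType = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.MIME_TYPE));
            String path = cursor.getString(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.DATA));
            long size = cursor.getLong(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.SIZE));
            long duration = cursor.getLong(cursor.getColumnIndexOrThrow(MediaStore.Video.Media.DURATION));
            list.add(new Video(id, title, album, artist, displayName, mimeType, path, size, duration));
        }
        cursor.close();
    }
    return list;
}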
Finally, the main class uses the following code to obtain the resulting collection of video information:
AbstructProvider provider = new VideoProvider(this);
List<Video> listVideos = provider.getList();
listVideos now contains the information of every local video; the next chapter will use this data.
In Android Local Video Player Development -- Searching Local Videos (1) we obtained the local video data; in this chapter we present that data dynamically through a ListView.
1. First, the layout code. The main layout contains only a ListView -- jie_video.xml:
android:layout_width="match_parent"
android:layout_height="match_parent" >
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentBottom="true"
android:id="@+id/jievideolistfile"
/>
2. The next layout is for the ListView item -- video_list_view.xml:
android:layout_width="match_parent"
android:layout_height="wrap_content" >
android:layout_width="120dp"
android:layout_height="80dp"
android:id="@+id/video_img"
android:contentDescription="@string/cont"
/>
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_toRightOf="@id/video_img"
android:layout_alignBottom="@id/video_img"
>
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:id="@+id/video_title"
android:gravity="center"
android:layout_marginTop="5dp"
android:text="@string/title"
/>
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:id="@+id/video_time"
android:gravity="center"
android:layout_alignParentBottom="true"
android:layout_marginBottom="5dp"
android:text="@string/time"
/>
3. With the layouts written, next comes the Activity. Displaying the video thumbnails requires asynchronous loading.
JieVideo.java
package com.zhangjie.graduation.videopalyer;
import java.util.List;
import com.zhangjie.graduation.videopalyer.component.JieVideoListViewAdapter;
import com.zhangjie.graduation.videopalyer.component.LoadedImage;
import com.zhangjie.graduation.videopalyer.videofile.AbstructProvider;
import com.zhangjie.graduation.videopalyer.videofile.Video;
import com.zhangjie.graduation.videopalyer.videofile.VideoProvider;

import android.app.Activity;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.drawable.BitmapDrawable;
import android.media.ThumbnailUtils;
import android.os.AsyncTask;
import android.os.Bundle;
import android.provider.MediaStore.Video.Thumbnails;
import android.view.View;
import android.widget.AdapterView;
import android.widget.ImageView;
import android.widget.ListView;
import android.widget.AdapterView.OnItemClickListener;

public class JieVideo extends Activity{
public JieVideo instance = null;
ListView mJieVideoListView;
JieVideoListViewAdapter mJieVideoListViewAdapter;
List<Video> listVideos;
int videoSize;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.jie_video);
instance = this;
AbstructProvider provider = new VideoProvider(instance);
listVideos = provider.getList();
videoSize = listVideos.size();
mJieVideoListViewAdapter = new JieVideoListViewAdapter(this, listVideos);
mJieVideoListView = (ListView)findViewById(R.id.jievideolistfile);
mJieVideoListView.setAdapter(mJieVideoListViewAdapter);
mJieVideoListView.setOnItemClickListener(new OnItemClickListener() {
@Override
public void onItemClick(AdapterView<?> parent, View view, int position,
long id) {
Intent intent = new Intent();
intent.setClass(JieVideo.this, JieVideoPlayer.class);
Bundle bundle = new Bundle();
bundle.putSerializable("video", listVideos.get(position));
intent.putExtras(bundle);
startActivity(intent);
}
});
loadImages();
}
/**
* Load images.
*/
private void loadImages() {
final Object data = getLastNonConfigurationInstance();
if (data == null) {
new LoadImagesFromSDCard().execute();
} else {
final LoadedImage[] photos = (LoadedImage[]) data;
if (photos.length == 0) {
new LoadImagesFromSDCard().execute();
}
for (LoadedImage photo : photos) {
addImage(photo);
}
}
}
private void addImage(LoadedImage... value) {
for (LoadedImage image : value) {
mJieVideoListViewAdapter.addPhoto(image);
mJieVideoListViewAdapter.notifyDataSetChanged();
}
}
@Override
public Object onRetainNonConfigurationInstance() {
final ListView grid = mJieVideoListView;
final int count = grid.getChildCount();
final LoadedImage[] list = new LoadedImage[count];
for (int i = 0; i < count; i++) {
// each child of the list is the item layout, so fetch the thumbnail view out of it
final ImageView v = (ImageView) grid.getChildAt(i).findViewById(R.id.video_img);
list[i] = new LoadedImage(
((BitmapDrawable) v.getDrawable()).getBitmap());
}
return list;
}
/**
* Get the thumbnail of a video.
* @param videoPath
* @param width
* @param height
* @param kind
* @return
*/
private Bitmap getVideoThumbnail(String videoPath, int width , int height, int kind){
Bitmap bitmap = null;
bitmap = ThumbnailUtils.createVideoThumbnail(videoPath, kind);
bitmap = ThumbnailUtils.extractThumbnail(bitmap, width, height, ThumbnailUtils.OPTIONS_RECYCLE_INPUT);
return bitmap;
}

class LoadImagesFromSDCard extends AsyncTask<Object, LoadedImage, Object> {
// ... (the body of this inner class was truncated in the original post;
// a reconstructed sketch follows right below)
}
}
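The truncated LoadImagesFromSDCard is reconstructed below as a minimal sketch. It assumes the task walks listVideos, builds each thumbnail with getVideoThumbnail() at the 120x80 size used by the item layout, and publishes every result so addImage() can append it to the adapter; the original may have differed in details:

class LoadImagesFromSDCard extends AsyncTask<Object, LoadedImage, Object> {
    @Override
    protected Object doInBackground(Object... params) {
        // Build one thumbnail per video off the UI thread.
        for (int i = 0; i < videoSize; i++) {
            Bitmap bitmap = getVideoThumbnail(listVideos.get(i).getPath(),
                    120, 80, Thumbnails.MINI_KIND);
            if (bitmap != null) {
                publishProgress(new LoadedImage(bitmap));
            }
        }
        return null;
    }

    @Override
    protected void onProgressUpdate(LoadedImage... value) {
        // Runs on the UI thread: append the thumbnail and refresh the list.
        addImage(value);
    }
}

On the receiving side, the JieVideoPlayer Activity (implemented in the later chapters) can presumably recover the clicked item with Video video = (Video) getIntent().getExtras().getSerializable("video"); only the "video" extra key is confirmed by the onItemClick() code above.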
4. The code above also needs a ListView adapter, which here extends BaseAdapter.
JieVideoListViewAdapter.java
package com.zhangjie.graduation.videopalyer.component;
import java.util.ArrayList;
import java.util.List;

import com.zhangjie.graduation.videopalyer.R;
import com.zhangjie.graduation.videopalyer.videofile.Video;

import android.content.Context;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.BaseAdapter;
import android.widget.ImageView;
import android.widget.TextView;

public class JieVideoListViewAdapter extends BaseAdapter{
List<Video> listVideos;
int local_postion = 0;
boolean imageChage = false;
private LayoutInflater mLayoutInflater;
private ArrayList<LoadedImage> photos = new ArrayList<LoadedImage>();
public JieVideoListViewAdapter(Context context, List<Video> listVideos){
mLayoutInflater = LayoutInflater.from(context);
this.listVideos = listVideos;
}
@Override
public int getCount() {
return photos.size();
}
public void addPhoto(LoadedImage image){
photos.add(image);
}
@Override
public Object getItem(int position) {
return position;
}
@Override
public long getItemId(int position) {
return position;
}

@Override
public View getView(int position, View convertView, ViewGroup parent) {
ViewHolder holder = null;
if (convertView == null) {
holder = new ViewHolder();
convertView = mLayoutInflater.inflate(R.layout.video_list_view, null);
holder.img = (ImageView)convertView.findViewById(R.id.video_img);
holder.title = (TextView)convertView.findViewById(R.id.video_title);
holder.time = (TextView)convertView.findViewById(R.id.video_time);
convertView.setTag(holder);
}else {
holder = (ViewHolder)convertView.getTag();
}
holder.title.setText(listVideos.get(position).getTitle());
// getDuration() is in milliseconds; convert to minutes and seconds
long min = listVideos.get(position).getDuration() / 1000 / 60;
long sec = listVideos.get(position).getDuration() / 1000 % 60;
holder.time.setText(min + " : " + sec);
holder.img.setImageBitmap(photos.get(position).getBitmap());
return convertView;
}

public final class ViewHolder{
public ImageView img;
public TextView title;
public TextView time;
}
}

5. The JieVideo class also uses a LoadedImage class; its code is as follows:
package com.zhangjie.graduation.videopalyer.component;
import android.graphics.Bitmap;
public class LoadedImage {
Bitmap mBitmap;

public LoadedImage(Bitmap bitmap) {
mBitmap = bitmap;
}

public Bitmap getBitmap() {
return mBitmap;
}
}

Now let's look at the final result:
In the Searching Local Videos chapters of this series we were able to search the local videos and display each one's thumbnail, title, and duration (other attributes, such as a video's width and height, can be added through the Video class if needed). With the videos found, the next step is decoding them. I decode with ffmpeg, so this chapter uses the NDK to build an FFmpeg library the phone can run.

First download the latest source from the official site, then create a script named config.sh in the ffmpeg directory with the following content:
NDK=/opt/android-ndk-r8d
PLATFORM=$NDK/platforms/android-8/arch-arm/
PREBUILT=$NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86
LOCAL_ARM_NEON=true
CPU=armv7-a
OPTIMIZE_CFLAGS="-mfloat-abi=softfp -mfpu=neon -marm -mcpu=cortex-a8"
PREFIX=./android/$CPU
./configure --target-os=linux \
--prefix=$PREFIX \
--enable-cross-compile \
--arch=arm \
--enable-nonfree \
--enable-asm \
--cpu=cortex-a8 \
--enable-neon \
--cc=$PREBUILT/bin/arm-linux-androideabi-gcc \
--cross-prefix=$PREBUILT/bin/arm-linux-androideabi- \
--nm=$PREBUILT/bin/arm-linux-androideabi-nm \
--sysroot=$PLATFORM \
--extra-cflags=" -O3 -fpic -DANDROID -DHAVE_SYS_UIO_H=1 $OPTIMIZE_CFLAGS " \
--disable-shared \
--enable-static \
--extra-ldflags="-Wl,-rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -nostdlib -lc -lm -ldl -llog" \
--disable-ffmpeg \
--disable-ffplay \
--disable-ffprobe \
--disable-ffserver \
--disable-swscale \
--disable-swresample \
--enable-avformat \
--enable-avcodec \
--disable-optimizations \
--disable-debug \
--disable-doc \
--disable-stripping \
--enable-pthreads \
--disable-yasm \
--enable-zlib \
--enable-pic \
--enable-small

make clean
make -j4 install

$PREBUILT/bin/arm-linux-androideabi-ar d libavcodec/libavcodec.a inverse.o
$PREBUILT/bin/arm-linux-androideabi-ld -rpath-link=$PLATFORM/usr/lib -L$PLATFORM/usr/lib -soname libffmpeg-neon.so -shared -nostdlib -z noexecstack -Bsymbolic --whole-archive --no-undefined -o $PREFIX/libffmpeg-neon.so libavcodec/libavcodec.a libavformat/libavformat.a libavutil/libavutil.a -lc -lm -lz -ldl -llog --warn-once --dynamic-linker=/system/bin/linker $PREBUILT/lib/gcc/arm-linux-androideabi/4.4.3/libgcc.a
Note that the NDK path above must be changed to your local path. This script enables NEON, so the resulting library only runs on phones whose CPU supports NEON; to run on non-NEON phones, remove the NEON-related options (--enable-neon and the -mfpu=neon flag).
Then execute the script, and libffmpeg-neon.so is generated.
The next chapter will use ffmpeg to decode the audio in a video file.

In the previous chapter, Android Local Video Player Development -- Compiling FFmpeg with the NDK, we obtained the compiled ffmpeg library. The next step is calling ffmpeg to decode, starting with the audio: decode it first, then play it. For adaptability I will play it in several different ways (Android's own AudioTrack, SDL, OpenAL, and OpenSL ES); what finally goes into the video player is OpenSL ES, which keeps CPU usage down. This chapter simply shows how to decode the audio; the next chapter plays it.
First we write the file that calls ffmpeg, named Decodec_Audio.c here:

#include <stdio.h>
#include <unistd.h>
#include <android/log.h>
#include "VideoPlayerDecode.h"
#include "../ffmpeg/libavutil/avutil.h"
#include "../ffmpeg/libavcodec/avcodec.h"
#include "../ffmpeg/libavformat/avformat.h"#define LOGI(...) ((void)__android_log_print(ANDROID_LOG_INFO, "graduation", __VA_ARGS__))
AVFormatContext *pFormatCtx = NULL;
int audioStream, delay_time, videoFlag = 0;
AVCodecContext *aCodecCtx;
AVCodec *aCodec;
AVFrame *aFrame;
AVPacket packet;
int frameFinished = 0;

JNIEXPORT jint JNICALL Java_com_zhangjie_graduation_videopalyer_jni_VideoPlayerDecode_VideoPlayer
(JNIEnv *env, jclass clz, jstring fileName)
{
const char* local_title = (*env)->GetStringUTFChars(env, fileName, NULL);
av_register_all(); // register all supported container formats and codecs
/*
* Reads only the file header; it does not fill in the stream information.
*/
if(avformat_open_input(&pFormatCtx, local_title, NULL, NULL) != 0)
return -1;
/*
* Obtain the stream information in the file. This function reads packets to
* determine every stream in the file and sets pFormatCtx->streams to point
* at them; it does not move the file pointer, and the packets it reads are
* kept for the decoding that follows.
*/
if(avformat_find_stream_info(pFormatCtx, NULL) < 0)
return -1;
/*
* Dump the file information, the same details ffmpeg prints for a file.
* The second argument selects which stream to report; -1 lets ffmpeg choose.
* The last argument states whether the dumped file is an output file; ours
* is an input file, so it must be 0.
*/
av_dump_format(pFormatCtx, -1, local_title, 0);
int i = 0;
audioStream = -1; // so the "no audio stream" case is detectable below
for(i=0; i< pFormatCtx->nb_streams; i++)
{
if(pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO){
audioStream = i;
break;
}
}
if(audioStream < 0)return -1;
aCodecCtx = pFormatCtx->streams[audioStream]->codec;
aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
if(avcodec_open2(aCodecCtx, aCodec, NULL) < 0)return -1;
aFrame = avcodec_alloc_frame();
if(aFrame == NULL)return -1;
int ret;
while(videoFlag != -1)
{
if(av_read_frame(pFormatCtx, &packet) < 0)break;
if(packet.stream_index == audioStream)
{
ret = avcodec_decode_audio4(aCodecCtx, aFrame, &frameFinished, &packet);
if(ret > 0 && frameFinished)
{
int data_size = av_samples_get_buffer_size(
aFrame->linesize,aCodecCtx->channels,
aFrame->nb_samples,aCodecCtx->sample_fmt, 0);
LOGI("audioDecodec :%d",data_size);
}
}
usleep(50000);
while(videoFlag != 0)
{
if(videoFlag == 1) // paused
{
sleep(1);
}else if(videoFlag == -1) // stopped
{
break;
}
}
av_free_packet(&packet);
}
av_free(aFrame);
avcodec_close(aCodecCtx);
avformat_close_input(&pFormatCtx);
(*env)->ReleaseStringUTFChars(env, fileName, local_title);
return 0;
}

JNIEXPORT jint JNICALL Java_com_zhangjie_graduation_videopalyer_jni_VideoPlayerDecode_VideoPlayerPauseOrPlay
(JNIEnv *env, jclass clz)
{
if(videoFlag == 1)
{
videoFlag = 0;
}else if(videoFlag == 0){
videoFlag = 1;
}
return videoFlag;
}

JNIEXPORT jint JNICALL Java_com_zhangjie_graduation_videopalyer_jni_VideoPlayerDecode_VideoPlayerStop
(JNIEnv *env, jclass clz)
{
videoFlag = -1;
return 0;
}

Next we write Android.mk:

LOCAL_PATH := $(call my-dir)

#######################################################
########## ffmpeg-prebuilt #######
#######################################################
#declare the prebuilt library
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg/android/armv7-a/libffmpeg-neon.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg/android/armv7-a/include
LOCAL_EXPORT_LDLIBS := ffmpeg/android/armv7-a/libffmpeg-neon.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)

########################################################
## ffmpeg-test-neno.so ########
########################################################
include $(CLEAR_VARS)
TARGET_ARCH_ABI=armeabi-v7a
LOCAL_ARM_MODE=arm
LOCAL_ARM_NEON=true
LOCAL_ALLOW_UNDEFINED_SYMBOLS=false
LOCAL_MODULE := ffmpeg-test-neon
LOCAL_SRC_FILES := jniffmpeg/Decodec_Audio.c

LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/android/armv7-a/include \
$(LOCAL_PATH)/ffmpeg \
$(LOCAL_PATH)/ffmpeg/libavutil \
$(LOCAL_PATH)/ffmpeg/libavcodec \
$(LOCAL_PATH)/ffmpeg/libavformat \
$(LOCAL_PATH)/ffmpeg/libavcodec \
$(LOCAL_PATH)/ffmpeg/libswscale \
$(LOCAL_PATH)/jniffmpeg \
$(LOCAL_PATH)
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
LOCAL_LDLIBS := -llog -ljnigraphics -lz -lm $(LOCAL_PATH)/ffmpeg/android/armv7-a/libffmpeg-neon.so
include $(BUILD_SHARED_LIBRARY)

Then run ndk-build in a terminal; the output is as follows:
root@zhangjie:/Graduation/jni# ndk-build
Install : libffmpeg-neon.so => libs/armeabi/libffmpeg-neon.so
Compile arm : ffmpeg-test-neon <= Decodec_Audio.c
SharedLibrary : libffmpeg-test-neon.so
Install : libffmpeg-test-neon.so => libs/armeabi/libffmpeg-test-neon.so

Copy the freshly built
libffmpeg-test-neon.so
libffmpeg-neon.so
into the libs/armeabi directory of the Android project from the earlier chapters. Tap a video and the file's audio starts decoding; each successfully decoded audio packet has its size printed:
06-07 04:51:30.953: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.000: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.109: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.156: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.257: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.304: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.406: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.460: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.554: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.609: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.710: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.757: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.859: I/graduation(7014): audioDecodec :2048
06-07 04:51:31.914: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.015: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.062: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.164: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.210: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.312: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.367: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.468: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.515: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.617: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.671: I/graduation(7014): audioDecodec :2048
06-07 04:51:32.773: I/graduation(7014): audioDecodec :2048

The logcat output shows the audio decodes successfully. In the next chapter we play the decoded audio.
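The Java side of this JNI interface is not listed in the post, but the package, class, and method names can be read off the JNI symbol names in Decodec_Audio.c. A sketch of what com.zhangjie.graduation.videopalyer.jni.VideoPlayerDecode presumably looks like (the loadLibrary names are assumptions matching the two .so files copied into libs/armeabi):

package com.zhangjie.graduation.videopalyer.jni;

public class VideoPlayerDecode {
    static {
        // Load the prebuilt FFmpeg library first, then the JNI wrapper that uses it.
        System.loadLibrary("ffmpeg-neon");
        System.loadLibrary("ffmpeg-test-neon");
    }

    // Opens fileName and decodes its audio; blocks until the stream ends
    // or VideoPlayerStop() is called.
    public static native int VideoPlayer(String fileName);

    // Toggles between pause (1) and play (0); returns the new state.
    public static native int VideoPlayerPauseOrPlay();

    // Asks the decode loop to exit.
    public static native int VideoPlayerStop();
}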
In Android Local Video Player Development -- Decoding the Audio in a Video File with ffmpeg (1) we decoded the audio out of a video file. In this chapter we play that decoded audio with OpenSL ES. I will not introduce OpenSL ES here; see the official site and the files under the NDK's samples/native-audio directory, from which I extracted the relevant code. The playback part is added on top of the previous chapter's code, as follows:
#include <stdio.h>
#include <unistd.h>
#include <assert.h>
#include <android/log.h>

// for native audio
#include <SLES/OpenSLES.h>
#include <SLES/OpenSLES_Android.h>

#include "VideoPlayerDecode.h"
#include "../ffmpeg/libavutil/avutil.h"
#include "../ffmpeg/libavcodec/avcodec.h"
#include "../ffmpeg/libavformat/avformat.h"#define LOGI(...) ((void)__android_log_print(ANDROID_LOG_INFO, "graduation", __VA_ARGS__))
AVFormatContext *pFormatCtx = NULL;
int audioStream, delay_time, videoFlag = 0;
AVCodecContext *aCodecCtx;
AVCodec *aCodec;
AVFrame *aFrame;
AVPacket packet;
int frameFinished = 0;

// engine interfaces
static SLObjectItf engineObject = NULL;
static SLEngineItf engineEngine;

// output mix interfaces
static SLObjectItf outputMixObject = NULL;
static SLEnvironmentalReverbItf outputMixEnvironmentalReverb = NULL;

// buffer queue player interfaces
static SLObjectItf bqPlayerObject = NULL;
static SLPlayItf bqPlayerPlay;
static SLAndroidSimpleBufferQueueItf bqPlayerBufferQueue;
static SLEffectSendItf bqPlayerEffectSend;
static SLMuteSoloItf bqPlayerMuteSolo;
static SLVolumeItf bqPlayerVolume;

// aux effect on the output mix, used by the buffer queue player
static const SLEnvironmentalReverbSettings reverbSettings =
SL_I3DL2_ENVIRONMENT_PRESET_STONECORRIDOR;

// file descriptor player interfaces
static SLObjectItf fdPlayerObject = NULL;
static SLPlayItf fdPlayerPlay;
static SLSeekItf fdPlayerSeek;
static SLMuteSoloItf fdPlayerMuteSolo;
static SLVolumeItf fdPlayerVolume;

// pointer and size of the next player buffer to enqueue, and number of remaining buffers
static short *nextBuffer;
static unsigned nextSize;
static int nextCount;

// this callback handler is called every time a buffer finishes playing
void bqPlayerCallback(SLAndroidSimpleBufferQueueItf bq, void *context)
{
assert(bq == bqPlayerBufferQueue);
assert(NULL == context);
// for streaming playback, replace this test by logic to find and fill the next buffer
if (--nextCount > 0 && NULL != nextBuffer && 0 != nextSize) {
SLresult result;
// enqueue another buffer
result = (*bqPlayerBufferQueue)->Enqueue(bqPlayerBufferQueue, nextBuffer, nextSize);
// the most likely other result is SL_RESULT_BUFFER_INSUFFICIENT,
// which for this code example would indicate a programming error
assert(SL_RESULT_SUCCESS == result);
}
}

void createEngine(JNIEnv* env, jclass clazz)
{
SLresult result;

// create engine
result = slCreateEngine(&engineObject, 0, NULL, 0, NULL, NULL);
assert(SL_RESULT_SUCCESS == result);

// realize the engine
result = (*engineObject)->Realize(engineObject, SL_BOOLEAN_FALSE);
assert(SL_RESULT_SUCCESS == result);

// get the engine interface, which is needed in order to create other objects
result = (*engineObject)->GetInterface(engineObject, SL_IID_ENGINE, &engineEngine);
assert(SL_RESULT_SUCCESS == result);

// create output mix, with environmental reverb specified as a non-required interface
const SLInterfaceID ids[1] = {SL_IID_ENVIRONMENTALREVERB};
const SLboolean req[1] = {SL_BOOLEAN_FALSE};
result = (*engineEngine)->CreateOutputMix(engineEngine, &outputMixObject, 1, ids, req);
assert(SL_RESULT_SUCCESS == result);

// realize the output mix
result = (*outputMixObject)->Realize(outputMixObject, SL_BOOLEAN_FALSE);
assert(SL_RESULT_SUCCESS == result);

// get the environmental reverb interface
// this could fail if the environmental reverb effect is not available,
// either because the feature is not present, excessive CPU load, or
// the required MODIFY_AUDIO_SETTINGS permission was not requested and granted
result = (*outputMixObject)->GetInterface(outputMixObject, SL_IID_ENVIRONMENTALREVERB,
&outputMixEnvironmentalReverb);
if (SL_RESULT_SUCCESS == result) {
result = (*outputMixEnvironmentalReverb)->SetEnvironmentalReverbProperties(
outputMixEnvironmentalReverb, &reverbSettings);
}
// ignore unsuccessful result codes for environmental reverb, as it is optional for this example
}

void createBufferQueueAudioPlayer(JNIEnv* env, jclass clazz, int rate, int channel, int bitsPerSample)
{
SLresult result;

// configure audio source
SLDataLocator_AndroidSimpleBufferQueue loc_bufq = {SL_DATALOCATOR_ANDROIDSIMPLEBUFFERQUEUE, 2};
// SLDataFormat_PCM format_pcm = {SL_DATAFORMAT_PCM, 2, SL_SAMPLINGRATE_16,
// SL_PCMSAMPLEFORMAT_FIXED_16, SL_PCMSAMPLEFORMAT_FIXED_16,
// SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT, SL_BYTEORDER_LITTLEENDIAN};
SLDataFormat_PCM format_pcm;
format_pcm.formatType = SL_DATAFORMAT_PCM;
format_pcm.numChannels = channel;
format_pcm.samplesPerSec = rate * 1000;
format_pcm.bitsPerSample = bitsPerSample;
format_pcm.containerSize = 16;
if(channel == 2)
format_pcm.channelMask = SL_SPEAKER_FRONT_LEFT | SL_SPEAKER_FRONT_RIGHT;
else
format_pcm.channelMask = SL_SPEAKER_FRONT_CENTER;
format_pcm.endianness = SL_BYTEORDER_LITTLEENDIAN;
SLDataSource audioSrc = {&loc_bufq, &format_pcm};

// configure audio sink
SLDataLocator_OutputMix loc_outmix = {SL_DATALOCATOR_OUTPUTMIX, outputMixObject};
SLDataSink audioSnk = {&loc_outmix, NULL};

// create audio player
const SLInterfaceID ids[3] = {SL_IID_BUFFERQUEUE, SL_IID_EFFECTSEND,
/*SL_IID_MUTESOLO,*/ SL_IID_VOLUME};
const SLboolean req[3] = {SL_BOOLEAN_TRUE, SL_BOOLEAN_TRUE,
/*SL_BOOLEAN_TRUE,*/ SL_BOOLEAN_TRUE};
result = (*engineEngine)->CreateAudioPlayer(engineEngine, &bqPlayerObject, &audioSrc, &audioSnk,
3, ids, req);
assert(SL_RESULT_SUCCESS == result);
// realize the player
result = (*bqPlayerObject)->Realize(bqPlayerObject, SL_BOOLEAN_FALSE);
assert(SL_RESULT_SUCCESS == result);

// get the play interface
result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_PLAY, &bqPlayerPlay);
assert(SL_RESULT_SUCCESS == result);

// get the buffer queue interface
result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_BUFFERQUEUE,
&bqPlayerBufferQueue);
assert(SL_RESULT_SUCCESS == result);

// register callback on the buffer queue
result = (*bqPlayerBufferQueue)->RegisterCallback(bqPlayerBufferQueue, bqPlayerCallback, NULL);
assert(SL_RESULT_SUCCESS == result);

// get the effect send interface
result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_EFFECTSEND,
&bqPlayerEffectSend);
assert(SL_RESULT_SUCCESS == result);

#if 0 // mute/solo is not supported for sources that are known to be mono, as this is
// get the mute/solo interface
result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_MUTESOLO, &bqPlayerMuteSolo);
assert(SL_RESULT_SUCCESS == result);
#endif

// get the volume interface
result = (*bqPlayerObject)->GetInterface(bqPlayerObject, SL_IID_VOLUME, &bqPlayerVolume);
assert(SL_RESULT_SUCCESS == result);

// set the player's state to playing
result = (*bqPlayerPlay)->SetPlayState(bqPlayerPlay, SL_PLAYSTATE_PLAYING);
assert(SL_RESULT_SUCCESS == result);
}
void AudioWrite(const void*buffer, int size)
{
(*bqPlayerBufferQueue)->Enqueue(bqPlayerBufferQueue, buffer, size);
}

JNIEXPORT jint JNICALL Java_com_zhangjie_graduation_videopalyer_jni_VideoPlayerDecode_VideoPlayer
(JNIEnv *env, jclass clz, jstring fileName)
{
const char* local_title = (*env)->GetStringUTFChars(env, fileName, NULL);
av_register_all(); // register all supported container formats and codecs
/*
* Reads only the file header; it does not fill in the stream information.
*/
if(avformat_open_input(&pFormatCtx, local_title, NULL, NULL) != 0)
return -1;
/*
* Obtain the stream information in the file. This function reads packets to
* determine every stream in the file and sets pFormatCtx->streams to point
* at them; it does not move the file pointer, and the packets it reads are
* kept for the decoding that follows.
*/
if(avformat_find_stream_info(pFormatCtx, NULL) < 0)
return -1;
/*
* Dump the file information, the same details ffmpeg prints for a file.
* The second argument selects which stream to report; -1 lets ffmpeg choose.
* The last argument states whether the dumped file is an output file; ours
* is an input file, so it must be 0.
*/
av_dump_format(pFormatCtx, -1, local_title, 0);
int i = 0;
audioStream = -1; // so the "no audio stream" case is detectable below
for(i=0; i< pFormatCtx->nb_streams; i++)
{
if(pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_AUDIO){
audioStream = i;
break;
}
}
if(audioStream < 0)return -1;
aCodecCtx = pFormatCtx->streams[audioStream]->codec;
aCodec = avcodec_find_decoder(aCodecCtx->codec_id);
if(avcodec_open2(aCodecCtx, aCodec, NULL) < 0)return -1;
aFrame = avcodec_alloc_frame();
if(aFrame == NULL)return -1;
int ret;
createEngine(env, clz);
int flag_start = 0;
while(videoFlag != -1)
{
if(av_read_frame(pFormatCtx, &packet) < 0)break;
if(packet.stream_index == audioStream)
{
ret = avcodec_decode_audio4(aCodecCtx, aFrame, &frameFinished, &packet);
if(ret > 0 && frameFinished)
{
if(flag_start == 0)
{
flag_start = 1;
createBufferQueueAudioPlayer(env, clz, aCodecCtx->sample_rate, aCodecCtx->channels, SL_PCMSAMPLEFORMAT_FIXED_16);
}
int data_size = av_samples_get_buffer_size(
aFrame->linesize,aCodecCtx->channels,
aFrame->nb_samples,aCodecCtx->sample_fmt, 1);
LOGI("audioDecodec :%d : %d, :%d :%d",data_size,aCodecCtx->channels,aFrame->nb_samples,aCodecCtx->sample_rate);
(*bqPlayerBufferQueue)->Enqueue(bqPlayerBufferQueue, aFrame->data[0], data_size);
}
}
usleep(5000);
while(videoFlag != 0)
{
if(videoFlag == 1) // paused
{
sleep(1);
}else if(videoFlag == -1) // stopped
{
break;
}
}
av_free_packet(&packet);
}
av_free(aFrame);
avcodec_close(aCodecCtx);
avformat_close_input(&pFormatCtx);
(*env)->ReleaseStringUTFChars(env, fileName, local_title);
return 0;
}

JNIEXPORT jint JNICALL Java_com_zhangjie_graduation_videopalyer_jni_VideoPlayerDecode_VideoPlayerPauseOrPlay
(JNIEnv *env, jclass clz)
{
if(videoFlag == 1)
{
videoFlag = 0;
}else if(videoFlag == 0){
videoFlag = 1;
}
return videoFlag;
}

JNIEXPORT jint JNICALL Java_com_zhangjie_graduation_videopalyer_jni_VideoPlayerDecode_VideoPlayerStop
(JNIEnv *env, jclass clz)
{
videoFlag = -1;
return 0;
}

Then OpenSL ES library support needs to be added in Android.mk, as follows:
LOCAL_PATH := $(call my-dir)
#######################################################
########## ffmpeg-prebuilt #######
#######################################################
#declare the prebuilt library
include $(CLEAR_VARS)
LOCAL_MODULE := ffmpeg-prebuilt
LOCAL_SRC_FILES := ffmpeg/android/armv7-a/libffmpeg-neon.so
LOCAL_EXPORT_C_INCLUDES := ffmpeg/android/armv7-a/include
LOCAL_EXPORT_LDLIBS := ffmpeg/android/armv7-a/libffmpeg-neon.so
LOCAL_PRELINK_MODULE := true
include $(PREBUILT_SHARED_LIBRARY)

########################################################
## ffmpeg-test-neno.so ########
########################################################
include $(CLEAR_VARS)
TARGET_ARCH_ABI=armeabi-v7a
LOCAL_ARM_MODE=arm
LOCAL_ARM_NEON=true
LOCAL_ALLOW_UNDEFINED_SYMBOLS=false
LOCAL_MODULE := ffmpeg-test-neon
#LOCAL_SRC_FILES := jniffmpeg/VideoPlayerDecode.c
LOCAL_SRC_FILES := jniffmpeg/Decodec_Audio.c

LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/android/armv7-a/include \
$(LOCAL_PATH)/ffmpeg \
$(LOCAL_PATH)/ffmpeg/libavutil \
$(LOCAL_PATH)/ffmpeg/libavcodec \
$(LOCAL_PATH)/ffmpeg/libavformat \
$(LOCAL_PATH)/ffmpeg/libavcodec \
$(LOCAL_PATH)/ffmpeg/libswscale \
$(LOCAL_PATH)/jniffmpeg \
$(LOCAL_PATH)
LOCAL_SHARED_LIBRARIES := ffmpeg-prebuilt
LOCAL_LDLIBS := -llog -lGLESv2 -ljnigraphics -lz -lm $(LOCAL_PATH)/ffmpeg/android/armv7-a/libffmpeg-neon.so
LOCAL_LDLIBS += -lOpenSLES
include $(BUILD_SHARED_LIBRARY)

Because OpenSL ES requires at least platform version 9, the platform has to be added in Application.mk:
# The ARMv7 is significantly faster due to the use of the hardware FPU
APP_ABI := armeabi
APP_PLATFORM := android-9
APP_STL := stlport_static
APP_CPPFLAGS += -fno-rtti
#APP_ABI := armeabi

Finally run ndk-build in a terminal, and the code is built into the ffmpeg-test-neon.so library.
Calling the VideoPlayer function on the Android side then plays the video's sound automatically. Testing shows the sound is right but carries noise, probably because the sample rate or some other configuration parameter is wrong. The next chapter focuses on fixing that, and also switches to queues: audio packets are pulled from the video into a packet queue, then taken off the queue, decoded, and played.
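Because the native VideoPlayer() blocks in its decode loop until VideoPlayerStop() is called, the Java side presumably invokes it off the UI thread. A hypothetical call site (the thread wrapper is illustrative; only the VideoPlayerDecode methods are confirmed by the JNI symbols):

final String path = video.getPath(); // the Video passed in from the list Activity
new Thread(new Runnable() {
    @Override
    public void run() {
        // Blocks until VideoPlayerStop() flips the native videoFlag to -1.
        VideoPlayerDecode.VideoPlayer(path);
    }
}).start();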
In the previous chapter, Android Local Video Player Development -- Decoding the Audio in a Video File with ffmpeg (2), we used OpenSL ES to play the video's sound. The problem left open is synchronization. The video part needs synchronizing too, and audio and video must be synchronized with each other, so audio will not be synchronized on its own; both are handled together once audio and video play at the same time. The video display will ultimately use OpenGL ES, and since SDL can be paired with OpenGL ES, the video will be displayed with SDL + OpenGL ES. This chapter therefore covers cross-compiling SDL.

1. Download the 2.0 source from the official site, or clone the latest source with hg:
hg clone

2. Android support has been included since version 1.3. After downloading the source, create a jni directory under the SDL directory and copy the files and folders in the SDL directory into jni. Inside jni there is an Android.mk, the build file the project provides for us. Since phones practically all use ARM, I added LOCAL_ARM_MODE=arm so the module is built in ARM (rather than Thumb) mode. The content is as follows:
###########################
#
# SDL shared library
#
###########################

include $(CLEAR_VARS)
LOCAL_MODULE := SDL2
LOCAL_ARM_MODE=arm
LOCAL_C_INCLUDES := $(LOCAL_PATH)/include

LOCAL_SRC_FILES := \
$(subst $(LOCAL_PATH)/,, \
$(wildcard $(LOCAL_PATH)/src/*.c) \
$(wildcard $(LOCAL_PATH)/src/audio/*.c) \
$(wildcard $(LOCAL_PATH)/src/audio/android/*.c) \
$(wildcard $(LOCAL_PATH)/src/audio/dummy/*.c) \
$(LOCAL_PATH)/src/atomic/SDL_atomic.c \
$(LOCAL_PATH)/src/atomic/SDL_spinlock.c.arm \
$(wildcard $(LOCAL_PATH)/src/core/android/*.cpp) \
$(wildcard $(LOCAL_PATH)/src/cpuinfo/*.c) \
$(wildcard $(LOCAL_PATH)/src/events/*.c) \
$(wildcard $(LOCAL_PATH)/src/file/*.c) \
$(wildcard $(LOCAL_PATH)/src/haptic/*.c) \
$(wildcard $(LOCAL_PATH)/src/haptic/dummy/*.c) \
$(wildcard $(LOCAL_PATH)/src/joystick/*.c) \
$(wildcard $(LOCAL_PATH)/src/joystick/android/*.c) \
$(wildcard $(LOCAL_PATH)/src/loadso/dlopen/*.c) \
$(wildcard $(LOCAL_PATH)/src/power/*.c) \
$(wildcard $(LOCAL_PATH)/src/power/android/*.c) \
$(wildcard $(LOCAL_PATH)/src/render/*.c) \
$(wildcard $(LOCAL_PATH)/src/render/*/*.c) \
$(wildcard $(LOCAL_PATH)/src/stdlib/*.c) \
$(wildcard $(LOCAL_PATH)/src/thread/*.c) \
$(wildcard $(LOCAL_PATH)/src/thread/pthread/*.c) \
$(wildcard $(LOCAL_PATH)/src/timer/*.c) \
$(wildcard $(LOCAL_PATH)/src/timer/unix/*.c) \
$(wildcard $(LOCAL_PATH)/src/video/*.c) \
$(wildcard $(LOCAL_PATH)/src/video/android/*.c))

LOCAL_CFLAGS += -DGL_GLEXT_PROTOTYPES
LOCAL_LDLIBS := -ldl -lGLESv1_CM -lGLESv2 -llog

include $(BUILD_SHARED_LIBRARY)
3. Building directly at this point fails with the following error:
/home/SDL/jni/src/core/android/SDL_android.cpp:30:21: fatal error: EGL/egl.h: No such file or directory
EGL support exists only from version 2.3 on, so the build platform has to be specified: create an Application.mk in the jni directory with the following content:
APP_ABI := armeabi
APP_PLATFORM := android-9

4. Run ndk-build in a terminal inside the jni directory, and libSDL2.so is compiled. The build process looks like this:
root@zhangjie:/home/SDL/jni# ndk-build
Compile arm : SDL2 <= SDL_assert.c
Compile arm : SDL2 <= SDL.c
Compile arm : SDL2 <= SDL_error.c
Compile arm : SDL2 <= SDL_fatal.c
Compile arm : SDL2 <= SDL_hints.c
Compile arm : SDL2 <= SDL_log.c
Compile arm : SDL2 <= SDL_audio.c
Compile arm : SDL2 <= SDL_audiocvt.c
Compile arm : SDL2 <= SDL_audiodev.c
Compile arm : SDL2 <= SDL_audiotypecvt.c
Compile arm : SDL2 <= SDL_mixer.c
Compile arm : SDL2 <= SDL_wave.c
Compile arm : SDL2 <= SDL_androidaudio.c
Compile arm : SDL2 <= SDL_dummyaudio.c
Compile arm : SDL2 <= SDL_atomic.c
Compile arm : SDL2 <= SDL_spinlock.c
Compile++ arm : SDL2 <= SDL_android.cpp
Compile arm : SDL2 <= SDL_cpuinfo.c
Compile arm : SDL2 <= SDL_clipboardevents.c
Compile arm : SDL2 <= SDL_dropevents.c
Compile arm : SDL2 <= SDL_events.c
Compile arm : SDL2 <= SDL_gesture.c
Compile arm : SDL2 <= SDL_keyboard.c
Compile arm : SDL2 <= SDL_mouse.c
Compile arm : SDL2 <= SDL_quit.c
Compile arm : SDL2 <= SDL_touch.c
Compile arm : SDL2 <= SDL_windowevents.c
Compile arm : SDL2 <= SDL_rwops.c
Compile arm : SDL2 <= SDL_haptic.c
Compile arm : SDL2 <= SDL_syshaptic.c
Compile arm : SDL2 <= SDL_gamecontroller.c
Compile arm : SDL2 <= SDL_joystick.c
Compile arm : SDL2 <= SDL_sysjoystick.c
Compile arm : SDL2 <= SDL_sysloadso.c
Compile arm : SDL2 <= SDL_power.c
Compile arm : SDL2 <= SDL_syspower.c
Compile arm : SDL2 <= SDL_render.c
Compile arm : SDL2 <= SDL_yuv_mmx.c
Compile arm : SDL2 <= SDL_yuv_sw.c
Compile arm : SDL2 <= SDL_render_d3d.c
Compile arm : SDL2 <= SDL_render_gles2.c
Compile arm : SDL2 <= SDL_shaders_gles2.c
Compile arm : SDL2 <= SDL_render_gles.c
Compile arm : SDL2 <= SDL_render_gl.c
Compile arm : SDL2 <= SDL_shaders_gl.c
Compile arm : SDL2 <= SDL_render_psp.c
Compile arm : SDL2 <= SDL_blendfillrect.c
Compile arm : SDL2 <= SDL_blendline.c
Compile arm : SDL2 <= SDL_blendpoint.c
Compile arm : SDL2 <= SDL_drawline.c
Compile arm : SDL2 <= SDL_drawpoint.c
Compile arm : SDL2 <= SDL_render_sw.c
Compile arm : SDL2 <= SDL_rotate.c
Compile arm : SDL2 <= SDL_getenv.c
Compile arm : SDL2 <= SDL_iconv.c
Compile arm : SDL2 <= SDL_malloc.c
Compile arm : SDL2 <= SDL_qsort.c
Compile arm : SDL2 <= SDL_stdlib.c
Compile arm : SDL2 <= SDL_string.c
Compile arm : SDL2 <= SDL_thread.c
Compile arm : SDL2 <= SDL_syscond.c
Compile arm : SDL2 <= SDL_sysmutex.c
Compile arm : SDL2 <= SDL_syssem.c
Compile arm : SDL2 <= SDL_systhread.c
Compile arm : SDL2 <= SDL_timer.c
Compile arm : SDL2 <= SDL_systimer.c
Compile arm : SDL2 <= SDL_blit_0.c
Compile arm : SDL2 <= SDL_blit_1.c
Compile arm : SDL2 <= SDL_blit_A.c
Compile arm : SDL2 <= SDL_blit_auto.c
Compile arm : SDL2 <= SDL_blit.c
Compile arm : SDL2 <= SDL_blit_copy.c
Compile arm : SDL2 <= SDL_blit_N.c
Compile arm : SDL2 <= SDL_blit_slow.c
Compile arm : SDL2 <= SDL_bmp.c
Compile arm : SDL2 <= SDL_clipboard.c
Compile arm : SDL2 <= SDL_fillrect.c
Compile arm : SDL2 <= SDL_pixels.c
Compile arm : SDL2 <= SDL_rect.c
Compile arm : SDL2 <= SDL_RLEaccel.c
Compile arm : SDL2 <= SDL_shape.c
Compile arm : SDL2 <= SDL_stretch.c
Compile arm : SDL2 <= SDL_surface.c
Compile arm : SDL2 <= SDL_video.c
Compile arm : SDL2 <= SDL_androidclipboard.c
Compile arm : SDL2 <= SDL_androidevents.c
Compile arm : SDL2 <= SDL_androidgl.c
Compile arm : SDL2 <= SDL_androidkeyboard.c
Compile arm : SDL2 <= SDL_androidtouch.c
Compile arm : SDL2 <= SDL_androidvideo.c
Compile arm : SDL2 <= SDL_androidwindow.c
StaticLibrary : libstdc++.a
SharedLibrary : libSDL2.so
Install : libSDL2.so => libs/armeabi/libSDL2.so

With the SDL library built, we can call its functions to implement our features.