[RK3288][Android 6.0] The MediaCodec usage flow through an example

Platform: Rockchip
OS: Android 6.0
Kernel: 3.10.92

Contents

    • Encoding pipeline
    • Main steps
    • Example source code
      • Main code
      • Encoder code
    • References

This is an example that uses MediaCodec; it walks through the whole encoding flow and makes a handy reference.

Encoding pipeline:

Grab frames from the Camera preview callback -> convert the data format -> feed the encoder input -> encode -> collect the encoder output

Main steps:

Below are the main MediaCodec steps, extracted from the example; see the full source code later for the details.

  • mediaCodec = MediaCodec.createEncoderByType("video/avc");

The encoding type is AVC (H.264).

  • MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width,height);

Create a MediaFormat for the given width and height; the encoder parameters are set on this MediaFormat.

  • mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

Set the parameters the use case needs: bitrate, frame rate, input color format, and I-frame interval.

  • mediaCodec.configure(mediaFormat, null, null,
    MediaCodec.CONFIGURE_FLAG_ENCODE);

The second argument is the surface the codec outputs to; here the encoder writes into ByteBuffers instead, so it can be null.
The fourth argument, MediaCodec.CONFIGURE_FLAG_ENCODE, configures the component as an encoder.

  • mediaCodec.start();

Start the codec.

  • avcCodec.offerEncoder(arg0,h264);

The Camera preview callback delivers a frame in arg0, which is fed to the encoder; the encoded result is written into the h264 buffer.
swapYV12toI420(input, yuv420, m_width, m_height);
The camera outputs YV12 but the encoder expects I420, so each frame has to be converted first.
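The conversion itself is only a plane shuffle: YV12 lays the planes out as Y, V, U, while the I420 the encoder expects is Y, U, V, so the two quarter-size chroma planes just swap places. Below is a standalone sketch of the same logic as the swapYV12toI420 helper in the source later in this article (the tiny 4x4 frame in main is purely illustrative):

```java
import java.util.Arrays;

public class YuvSwap {
    // YV12 layout: Y (w*h) | V (w*h/4) | U (w*h/4)
    // I420 layout: Y (w*h) | U (w*h/4) | V (w*h/4)
    public static byte[] yv12ToI420(byte[] yv12, int width, int height) {
        int ySize = width * height;
        int cSize = ySize / 4;
        byte[] i420 = new byte[ySize + 2 * cSize];
        System.arraycopy(yv12, 0, i420, 0, ySize);                 // Y plane unchanged
        System.arraycopy(yv12, ySize + cSize, i420, ySize, cSize); // U sits after V in YV12
        System.arraycopy(yv12, ySize, i420, ySize + cSize, cSize); // V sits right after Y in YV12
        return i420;
    }

    public static void main(String[] args) {
        // 4x4 frame: 16 Y bytes (1), 4 V bytes (2), 4 U bytes (3)
        byte[] yv12 = new byte[24];
        Arrays.fill(yv12, 0, 16, (byte) 1);
        Arrays.fill(yv12, 16, 20, (byte) 2); // V plane
        Arrays.fill(yv12, 20, 24, (byte) 3); // U plane
        byte[] i420 = yv12ToI420(yv12, 4, 4);
        // chroma half after conversion: U first, then V
        System.out.println(Arrays.toString(Arrays.copyOfRange(i420, 16, 24))); // prints [3, 3, 3, 3, 2, 2, 2, 2]
    }
}
```

Note that both buffers have the same size (width*height*3/2 bytes), which is why the example can pre-allocate them once in the encoder constructor.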

  • ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
    int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
    ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];

Get the array of input buffers and dequeue the index of a free one; that index selects the buffer to fill. The timeout of -1 blocks until an input buffer becomes available.

  • inputBuffer.put(yuv420);
    mediaCodec.queueInputBuffer(inputBufferIndex, 0, yuv420.length, 0, 0);

Once the filled input buffer is queued back, the codec starts encoding it.
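A caveat worth noting: the example passes 0 as the presentationTimeUs for every frame, which a muxer or player downstream would choke on, since timestamps must increase monotonically. A common fix (sketched here with a hypothetical ptsUs helper, not part of the original source) derives the timestamp from the frame index and frame rate:

```java
public class Pts {
    // Presentation timestamp in microseconds for frame N at the given frame rate.
    public static long ptsUs(long frameIndex, int framerate) {
        return frameIndex * 1_000_000L / framerate;
    }

    public static void main(String[] args) {
        // At 30 fps, frame 30 lands exactly at one second.
        System.out.println(ptsUs(30, 30)); // prints 1000000
    }
}
```

The value would replace the fourth argument of queueInputBuffer, with a per-frame counter incremented on each call.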

  • ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
    outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,0);
    outputBuffer = outputBuffers[outputBufferIndex];

In the same way, when encoding finishes, dequeue the output buffer that holds the encoded data.

  • mediaCodec.releaseOutputBuffer(outputBufferIndex, false);

Release the output buffer once it has been consumed so it can be recycled for the next round of encoding.

  • FileOut.write(output,0,pos);

The encoded data can be handed to a decoder, sent over the network, and so on; here it is simply written to a file.
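What gets written is a raw Annex-B H.264 stream: every NAL unit starts with a 00 00 00 01 start code, and the low five bits of the byte that follows give the NAL unit type (7 = SPS, 8 = PPS, 5 = IDR slice). This is why the encoder source later in this article tests output[4] == 0x65 to spot key frames, since 0x65 & 0x1F == 5. A minimal sketch of that check:

```java
public class NalType {
    // NAL unit type: low five bits of the first byte after the 4-byte start code.
    public static int nalType(byte[] annexB) {
        return annexB[4] & 0x1F;
    }

    public static boolean isIdr(byte[] annexB) {
        return nalType(annexB) == 5; // 5 = IDR slice (key frame)
    }

    public static void main(String[] args) {
        byte[] keyFrame = {0, 0, 0, 1, 0x65};
        System.out.println(nalType(keyFrame) + " " + isIdr(keyFrame)); // prints 5 true
    }
}
```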

Example source code:

Main code:

package com.example.com.rk.dec_enc;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;

import android.graphics.ImageFormat;
import android.hardware.Camera;
import android.hardware.Camera.PreviewCallback;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.annotation.SuppressLint;
import android.app.Activity;
import android.util.Log;
import android.view.Menu;
import android.view.SurfaceHolder;
import android.view.SurfaceHolder.Callback;
import android.view.SurfaceView;
import android.view.View;

@SuppressLint("NewApi") public class MainActivity extends Activity implements PreviewCallback {

    AvcEncoder avcCodec = null;
    AvcDecoder avcDecodec = null;
    boolean avcDecFlag = false;
    boolean avcEncFlag = false;
    //boolean RecordCameraFrameDataFlag = true;
    boolean RecordCameraFrameDataFlag = false;
    public Camera m_camera;  
    SurfaceView   m_prevewview;
    SurfaceHolder m_surfaceHolder;
    SurfaceView   mDecSurfaceView;
    SurfaceHolder mDecSurfaceHolder;

    MediaPlayer mMediaPlayer;

    private SurfaceHolder.Callback mSurfaceCallback;


    int width = 1280;
    int height = 720;
    int framerate = 30;
    int bitrate = 500000;
    FileOutputStream FileOut = null; 
    byte[] h264 = new byte[width*height*3/2];


    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        m_prevewview = (SurfaceView) findViewById(R.id.SurfaceViewPlay);
        mDecSurfaceView = (SurfaceView) findViewById(R.id.SurfaceViewPlay_Dec);

        mSurfaceCallback = new SurfaceHolder.Callback(){



            @Override
            public void surfaceChanged(SurfaceHolder arg0, int arg1, int arg2,
                    int arg3) {
                // TODO Auto-generated method stub

            }

            @Override
            public void surfaceCreated(SurfaceHolder arg0) {
                // TODO Auto-generated method stub
                Log.d("Hery", "arg0 =" + arg0);
                mDecSurfaceHolder = mDecSurfaceView.getHolder(); // bind the SurfaceView and get its SurfaceHolder
                m_surfaceHolder = m_prevewview.getHolder(); // bind the SurfaceView and get its SurfaceHolder
                mDecSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
                m_surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

                boolean m_surfaceHolderisCreat = m_surfaceHolder.isCreating();
                Log.d("Hery", "m_surfaceHolder =" + m_surfaceHolderisCreat);
                boolean mmDecSurfaceHolderisCreat = mDecSurfaceHolder.isCreating();
                Log.d("Hery", "mmDecSurfaceHolderisCreat =" + mmDecSurfaceHolderisCreat);
                if(mmDecSurfaceHolderisCreat && (avcDecodec == null) && !avcDecFlag)
                {
                    Log.d("Hery", "Create AvcDecoder");
                    avcDecodec = new AvcDecoder(width,height,mDecSurfaceHolder);
                    avcDecFlag = true;
                }
                if(m_surfaceHolderisCreat && (avcCodec == null) && !avcEncFlag)
                {
                    Log.d("Hery", "Create Camera");
                    avcCodec = new AvcEncoder(width,height,framerate,bitrate);
                    avcEncFlag = true;

                    m_camera = Camera.open();

                    try {
                        m_camera.setPreviewDisplay(m_surfaceHolder);
                    } catch (IOException e) {
                        // TODO Auto-generated catch block
                        e.printStackTrace();
                    }

                    Camera.Parameters parameters = m_camera.getParameters();
                    parameters.setPreviewSize(width, height);
                    parameters.setPictureSize(width, height);
                    parameters.setPreviewFormat(ImageFormat.YV12);
                    m_camera.setParameters(parameters); 
                    m_camera.setPreviewCallback((PreviewCallback)MainActivity.this);
                    m_camera.startPreview();    

                }

            }

            @Override
            public void surfaceDestroyed(SurfaceHolder arg0) {
                // TODO Auto-generated method stub
                Log.d("Hery","surfaceDestroyed");

            }


        };


        mDecSurfaceView.getHolder().addCallback(mSurfaceCallback);
        m_prevewview.getHolder().addCallback(mSurfaceCallback);
        m_prevewview.setVisibility(View.VISIBLE);
        mDecSurfaceView.setVisibility(View.VISIBLE);
        if(RecordCameraFrameDataFlag)
        {
            try {
                FileOut = new FileOutputStream(new File("/sdcard/app_camera.yuv"));
            } catch (FileNotFoundException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }


    }


    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }


    @Override
    public void onPreviewFrame(byte[] arg0, Camera arg1) {
        //Log.d("Hery", "onPreviewFrame");
        if(RecordCameraFrameDataFlag)
        {
            try {
                FileOut.write(arg0,0,arg0.length);
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
        // TODO Auto-generated method stub
        int size = avcCodec.offerEncoder(arg0, h264);
        if (size > 0 && avcDecodec != null) { // guard: decoder may not exist yet, and size can be -1
            avcDecodec.pushData(h264, size);
        }
    }   
}

Encoder code:

package com.example.com.rk.dec_enc;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import android.annotation.SuppressLint;
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.util.Log;

@SuppressLint("NewApi") public class AvcEncoder {
    private MediaCodec mediaCodec;

    int m_width;
    int m_height;
    //boolean RecordEncDataFlag = true;
    boolean RecordEncDataFlag = false;
    byte[] m_info = null;
    FileOutputStream FileOut = null; 

    private byte[] yuv420 = null; 
    public AvcEncoder(int width, int height, int framerate, int bitrate) { 
        Log.d("Hery", "AvcEncoder IN");
        m_width  = width;
        m_height = height;
        yuv420 = new byte[width*height*3/2];


        try {
            mediaCodec = MediaCodec.createEncoderByType("video/avc"); // declared throws IOException since API 21
        } catch (IOException e) {
            e.printStackTrace();
            return;
        }
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", width, height);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, bitrate);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, framerate);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);    
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
        if(RecordEncDataFlag)
        {
            try {
                FileOut = new FileOutputStream(new File("/sdcard/app_camera_enc.h264"));
            } catch (FileNotFoundException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }


    public void close() {
        try {
            mediaCodec.stop();
            mediaCodec.release();
        } catch (Exception e){ 
            e.printStackTrace();
        }

        try {
            FileOut.close();
        } catch (IOException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }


    public int offerEncoder(byte[] input, byte[] output) 
    {   
        //Log.d("......................................................Hery", "Encoder in");
        int pos = 0;
        swapYV12toI420(input, yuv420, m_width, m_height);
        try {
            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
            //Log.d("......................................................Hery", "inputBufferIndex = " +inputBufferIndex);
            if (inputBufferIndex >= 0) 
            {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(yuv420);
                mediaCodec.queueInputBuffer(inputBufferIndex, 0, yuv420.length, 0, 0); // queue the converted I420 frame
            }

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,0);
            //Log.d("......................................................Hery", "outputBufferIndex = " +outputBufferIndex);
            while (outputBufferIndex >= 0) 
            {
                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);

                if (m_info != null)
                {
                    // SPS/PPS already cached: append the encoded frame to the output
                    System.arraycopy(outData, 0, output, pos, outData.length);
                    pos += outData.length;
                }
                else
                {
                    // The first buffer the encoder emits is the codec config
                    // (SPS/PPS); cache it so it can be prepended to key frames.
                    ByteBuffer spsPpsBuffer = ByteBuffer.wrap(outData);
                    if (spsPpsBuffer.getInt() == 0x00000001) // Annex-B start code
                    {
                        m_info = new byte[outData.length];
                        System.arraycopy(outData, 0, m_info, 0, outData.length);
                        System.arraycopy(outData, 0, output, pos, outData.length);
                        pos += outData.length;
                    }
                    else
                    {
                        return -1;
                    }
                }

                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            }

            if (output[4] == 0x65) // NAL type 5 (IDR): prepend cached SPS/PPS to the key frame
            {
                // yuv420 is reused here as scratch space; the encoded frame is
                // smaller than the raw frame, so it fits.
                System.arraycopy(output, 0, yuv420, 0, pos);
                System.arraycopy(m_info, 0, output, 0, m_info.length);
                System.arraycopy(yuv420, 0, output, m_info.length, pos);
                pos += m_info.length;
            }

        } catch (Throwable t) {
            t.printStackTrace();
        }
        //Log.d("......................................................Hery", "Encoder out");

        if(RecordEncDataFlag)
        {
            try {
                FileOut.write(output,0,pos);
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
        return pos;
    }

    // YV12 stores the planes as Y,V,U while I420 stores them as Y,U,V:
    // copy the Y plane through and swap the two quarter-size chroma planes.
    private void swapYV12toI420(byte[] yv12bytes, byte[] i420bytes, int width, int height)
    {
        System.arraycopy(yv12bytes, 0, i420bytes, 0, width*height);
        System.arraycopy(yv12bytes, width*height + width*height/4, i420bytes, width*height, width*height/4);
        System.arraycopy(yv12bytes, width*height, i420bytes, width*height + width*height/4, width*height/4);
    }
}

References:

Android中直播视频技术探究之—基础知识大纲介绍
