Audio/Video Learning Notes, MediaCodec Series Part 1: Encoding

Articles about audio/video development on the web tend to be scattered, which makes self-study difficult. This post collects my own learning notes, using Camera2, AudioRecord, and MediaCodec.

MediaCodec is the class Android provides for accessing low-level media encoders and decoders. It is part of Android's low-level multimedia architecture and is usually used together with MediaExtractor, MediaMuxer, and AudioTrack. It can encode and decode common audio/video formats such as H.264, H.265, AAC, and 3GP.

1. Lifecycle

MediaCodec has three lifecycle states:

Stopped: calling stop() returns MediaCodec to the Uninitialized sub-state. Stopped has three sub-states:

  • Uninitialized: when a MediaCodec object is created, it starts in the Uninitialized state
  • Configured: calling configure(…) moves MediaCodec to the Configured state
  • Error: entered when the codec encounters an error; calling reset() from any state returns MediaCodec to the Uninitialized state

Executing: you can return to the Flushed sub-state at any time while Executing by calling flush().

  • Flushed: MediaCodec enters the Flushed sub-state immediately after start() is called
  • Running: as soon as the first input buffer is dequeued, MediaCodec moves to the Running sub-state
  • End-of-Stream: when you queue an input buffer carrying the end-of-stream flag, MediaCodec moves to the End-of-Stream sub-state

Released: once you are done with a MediaCodec, you must call release() to free its resources.
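
These states map directly onto the call sequence of the API. Below is a minimal sketch of a surface-input H.264 encoder's lifecycle; the format values are placeholders and the comments mark the state each call leads to.

//Uninitialized: the codec has just been created
val codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, 1280, 720)
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
format.setInteger(MediaFormat.KEY_BIT_RATE, 1_250_000)
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30)
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5)
//Configured
codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
val encoderInput = codec.createInputSurface()
//Flushed, then Running once the first buffer flows
codec.start()
//...feed frames via encoderInput; signalEndOfInputStream() leads to the End-of-Stream sub-state...
codec.signalEndOfInputStream()
//Back to Uninitialized
codec.stop()
//Released; the codec can no longer be used
codec.release()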

2. Encoders / Decoders

//Decoder
val mVideoDecoder = MediaCodec.createDecoderByType("video/avc")
//Encoder
val mVideoEncoder = MediaCodec.createEncoderByType("video/avc")

Common MIME type constants (defined in MediaFormat):

public static final String MIMETYPE_VIDEO_VP8 = "video/x-vnd.on2.vp8";
public static final String MIMETYPE_VIDEO_VP9 = "video/x-vnd.on2.vp9";
// Common video format: H.264/AVC
public static final String MIMETYPE_VIDEO_AVC = "video/avc";
public static final String MIMETYPE_VIDEO_HEVC = "video/hevc";
public static final String MIMETYPE_VIDEO_MPEG4 = "video/mp4v-es";
public static final String MIMETYPE_VIDEO_H263 = "video/3gpp";
public static final String MIMETYPE_VIDEO_MPEG2 = "video/mpeg2";
public static final String MIMETYPE_VIDEO_RAW = "video/raw";
public static final String MIMETYPE_VIDEO_DOLBY_VISION = "video/dolby-vision";
public static final String MIMETYPE_VIDEO_SCRAMBLED = "video/scrambled";

public static final String MIMETYPE_AUDIO_AMR_NB = "audio/3gpp";
public static final String MIMETYPE_AUDIO_AMR_WB = "audio/amr-wb";
public static final String MIMETYPE_AUDIO_MPEG = "audio/mpeg";
public static final String MIMETYPE_AUDIO_AAC = "audio/mp4a-latm";
public static final String MIMETYPE_AUDIO_QCELP = "audio/qcelp";
public static final String MIMETYPE_AUDIO_VORBIS = "audio/vorbis";
public static final String MIMETYPE_AUDIO_OPUS = "audio/opus";
public static final String MIMETYPE_AUDIO_G711_ALAW = "audio/g711-alaw";
public static final String MIMETYPE_AUDIO_G711_MLAW = "audio/g711-mlaw";
public static final String MIMETYPE_AUDIO_RAW = "audio/raw";
public static final String MIMETYPE_AUDIO_FLAC = "audio/flac";
public static final String MIMETYPE_AUDIO_MSGSM = "audio/gsm";
public static final String MIMETYPE_AUDIO_AC3 = "audio/ac3";
public static final String MIMETYPE_AUDIO_EAC3 = "audio/eac3";
public static final String MIMETYPE_AUDIO_SCRAMBLED = "audio/scrambled";

Full list of audio/video formats: https://developer.android.google.cn/reference/kotlin/android/media/MediaFormat?hl=en
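
Not every device ships an encoder for every MIME type in the list above, so it can be worth checking with MediaCodecList before calling createEncoderByType. A small sketch (the helper name findEncoderName is ours):

//Returns the name of an encoder that supports the given MIME type, or null if none exists
fun findEncoderName(mimeType: String): String? {
    val codecList = MediaCodecList(MediaCodecList.REGULAR_CODECS)
    return codecList.codecInfos
        .firstOrNull { it.isEncoder && it.supportedTypes.any { type -> type.equals(mimeType, ignoreCase = true) } }
        ?.name
}

//Usage: create the encoder by name only if the device supports it
val encoderName = findEncoderName(MediaFormat.MIMETYPE_VIDEO_AVC)
val encoder = encoderName?.let { MediaCodec.createByCodecName(it) }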

3. Example

Initialization

//List of camera ids
private lateinit var idList: Array<String>
//Current camera id
private lateinit var cameraId: String
//CameraCharacteristics of the current camera
private lateinit var cameraCharacteristics: CameraCharacteristics
//Preview Surface
private lateinit var surface: Surface
//Encoder input Surface
private lateinit var inputSurface: Surface
//Current camera
private lateinit var cameraDevice: CameraDevice
//Preview CaptureRequest.Builder
private lateinit var captureRequestBuilder: CaptureRequest.Builder
//Session
private lateinit var cameraCaptureSession: CameraCaptureSession
//Camera resolution
private lateinit var size: Size
//Video encoder
private lateinit var videoEncoder: MediaCodec
//Video output format
private lateinit var outputFormat: MediaFormat
//Audio encoder
private lateinit var audioEncoder: MediaCodec
private lateinit var audioRecord: AudioRecord
private lateinit var mBufferInfo: MediaCodec.BufferInfo
//Coroutine job for audio encoding
private var audioRecordJob: Job? = null

override fun init() {
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        //Initialize
        CameraUtil.init(application)
        //Get the list of camera ids
        idList = CameraUtil.getCameraIdList()
        cameraId = idList[CameraUtil.index]

        //Register the surfaceTextureListener to obtain the Surface
        liveTextureView.surfaceTextureListener = surfaceTextureListener
    }
}
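
Here the example simply starts from idList[CameraUtil.index]. If you want to start from the back camera explicitly, one option (an illustrative helper, not part of CameraUtil) is to filter the id list by LENS_FACING:

//Illustrative: pick the first back-facing camera id, falling back to the first id in the list
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private fun pickBackCameraId(ids: Array<String>): String {
    for (id in ids) {
        val characteristics = CameraUtil.getCameraCharacteristics(id)
        if (characteristics.get(CameraCharacteristics.LENS_FACING) == CameraCharacteristics.LENS_FACING_BACK) {
            return id
        }
    }
    return ids.first()
}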

Setting up MediaCodec

/**
 * Callback that supplies the Surface
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private val surfaceTextureListener = object : TextureView.SurfaceTextureListener {
    //Called when the SurfaceTexture size changes
    override fun onSurfaceTextureSizeChanged(
        surfaceTexture: SurfaceTexture,
        width: Int,
        height: Int
    ) {
        //Get the CameraCharacteristics of the camera
        cameraCharacteristics = CameraUtil.getCameraCharacteristics(cameraId)

        //Apply the preset preview size
        size = CameraUtil.setDefaultBufferSize(surfaceTexture, cameraCharacteristics)

        surface = Surface(surfaceTexture)
    }

    override fun onSurfaceTextureUpdated(surface: SurfaceTexture?) {

    }

    override fun onSurfaceTextureDestroyed(surfaceTexture: SurfaceTexture?): Boolean {
        return true
    }

    override fun onSurfaceTextureAvailable(
        surfaceTexture: SurfaceTexture,
        width: Int,
        height: Int
    ) {
        //Get the CameraCharacteristics of the camera
        cameraCharacteristics = CameraUtil.getCameraCharacteristics(cameraId)

        //Apply the preset preview size
        size = CameraUtil.setDefaultBufferSize(surfaceTexture, cameraCharacteristics)

        surface = Surface(surfaceTexture)

        //Configure the video encoder format
        videoEncoder = MediaCodecUtil.createSurfaceVideoEncoder(size)
        //Obtain the encoder input Surface
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            inputSurface = MediaCodec.createPersistentInputSurface()
            videoEncoder.setInputSurface(inputSurface)
        } else {
            inputSurface = videoEncoder.createInputSurface()
        }
        //Set the output callback
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
            videoEncoder.setCallback(mediaCodecCallback, CameraUtil.getBackgroundHandler())
        } else {
            videoEncoder.setCallback(mediaCodecCallback)
        }
        //Get the output format
        outputFormat = videoEncoder.outputFormat
        videoEncoder.start()
        
        //Audio recorder
        audioRecord = AudioRecordUtil.getSingleAudioRecord(AudioFormat.CHANNEL_IN_STEREO)
        //Audio encoder
        audioEncoder = MediaCodecUtil.createAudioEncoder()
        audioEncoder.start()
        
        mBufferInfo = MediaCodec.BufferInfo()

        //Open the camera
        CameraUtil.openCamera(this@LiveActivity2, cameraId, mStateCallback)
    }
}

/**
 * Asynchronous callback for the hardware encoder
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private val mediaCodecCallback = object : MediaCodec.Callback() {
    override fun onOutputBufferAvailable(
        codec: MediaCodec,
        index: Int,
        info: MediaCodec.BufferInfo
    ) {
        when {
            index == MediaCodec.INFO_TRY_AGAIN_LATER ->
                showToast("Timed out")
            index >= 0 -> {
                val outputBuffer: ByteBuffer? = codec.getOutputBuffer(index)
                if (outputBuffer != null && info.size > 0) {
                    //The encoded H.264 data occupies [info.offset, info.offset + info.size)
                    val encodedData = ByteArray(info.size)
                    outputBuffer.position(info.offset)
                    outputBuffer.limit(info.offset + info.size)
                    outputBuffer.get(encodedData)
                    //TODO: hand encodedData to a muxer or network stream
                }
                //Encoder output is never rendered to a Surface, so pass render = false
                codec.releaseOutputBuffer(index, false)
            }
        }
    }

    override fun onInputBufferAvailable(codec: MediaCodec, index: Int) {

    }

    override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
        outputFormat = format
    }

    override fun onError(codec: MediaCodec, e: MediaCodec.CodecException) {
        showToast("${e.message}")
    }

}
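
The callback above only releases the output buffers. In a real recorder the encoded H.264 stream would typically be handed to a MediaMuxer (or to a streaming stack). The sketch below assumes extra members mediaMuxer, videoTrackIndex, and muxerStarted that are not part of the original example; the track can only be added once onOutputFormatChanged has delivered the real output format.

//Hypothetical members: private lateinit var mediaMuxer: MediaMuxer; private var videoTrackIndex = -1; private var muxerStarted = false

override fun onOutputFormatChanged(codec: MediaCodec, format: MediaFormat) {
    //The actual output format (including csd-0/csd-1) is only known here
    videoTrackIndex = mediaMuxer.addTrack(format)
    mediaMuxer.start()
    muxerStarted = true
}

override fun onOutputBufferAvailable(codec: MediaCodec, index: Int, info: MediaCodec.BufferInfo) {
    val buffer = codec.getOutputBuffer(index)
    if (buffer != null && muxerStarted && info.size > 0 &&
        (info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0
    ) {
        buffer.position(info.offset)
        buffer.limit(info.offset + info.size)
        mediaMuxer.writeSampleData(videoTrackIndex, buffer, info)
    }
    codec.releaseOutputBuffer(index, false)
}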

Opening the camera

/**
 * Callback for opening the camera
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private val mStateCallback = object : CameraDevice.StateCallback() {
    //Called when the camera opens successfully; provides a CameraDevice instance
    override fun onOpened(camera: CameraDevice) {
        cameraDevice = camera

        //Create a preview CaptureRequest
        captureRequestBuilder =
            CameraUtil.createPreviewCaptureRequest(cameraDevice, inputSurface)
        captureRequestBuilder.addTarget(surface)

        //Create a Session
        cameraDevice.createCaptureSession(
            arrayListOf(inputSurface, surface),
            mSessionCallback,
            CameraUtil.getBackgroundHandler()
        )
    }

    //Called when the camera is no longer available; resources are usually released here
    override fun onDisconnected(camera: CameraDevice) {
        showToast("Camera is no longer available")
    }

    // Called when opening the camera fails; error gives the reason, and resources should also be released here
    override fun onError(camera: CameraDevice, error: Int) {
        camera.close()
        CameraUtil.showError(error)
        CameraUtil.releaseBackgroundThread()
    }

    //Called when the camera is closed
    override fun onClosed(camera: CameraDevice) {
        super.onClosed(camera)
        cameraCaptureSession.close()
    }
}

CameraCaptureSession

/**
 * Callback for creating the preview Session
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private val mSessionCallback = object : CameraCaptureSession.StateCallback() {

    override fun onConfigured(session: CameraCaptureSession) {
        cameraCaptureSession = session

        // Start the preview by setting a repeating request
        cameraCaptureSession.setRepeatingRequest(
            captureRequestBuilder.build(),
            captureCallback,
            CameraUtil.getBackgroundHandler()
        )
        
        //Start audio recording
        startRecording()
    }

    //Session creation failed
    override fun onConfigureFailed(session: CameraCaptureSession) {
        showToast("Failed to create Session")
    }
}

/**
 * Callback for capture progress of the Session
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private val captureCallback = object : CameraCaptureSession.CaptureCallback() {
    override fun onCaptureCompleted(
        session: CameraCaptureSession,
        request: CaptureRequest,
        result: TotalCaptureResult
    ) {
        super.onCaptureCompleted(session, request, result)
    }

    override fun onCaptureFailed(
        session: CameraCaptureSession,
        request: CaptureRequest,
        failure: CaptureFailure
    ) {
        super.onCaptureFailed(session, request, failure)
        when (failure.reason) {
            CaptureFailure.REASON_ERROR -> {
                showToast("Capture failed: REASON_ERROR")
                cameraCaptureSession.close()
            }
            CaptureFailure.REASON_FLUSHED -> showToast("Capture failed: REASON_FLUSHED")
            else -> showToast("Capture failed: UNKNOWN")
        }
    }
}

/**
 * Start audio recording and encoding
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
private fun startRecording() {
    //Start recording
    audioRecord.startRecording()
    val bufferSize = AudioRecordUtil.getBufferSizeInBytes()
    val mAudioBuffer = ByteArray(bufferSize)
    audioRecordJob = GlobalScope.launch(Dispatchers.IO) {
        while (isActive) {
            //Read one buffer of PCM data from the microphone
            val length = audioRecord.read(mAudioBuffer, 0, bufferSize)
            if (length <= 0) continue

            //Feed the PCM data into the encoder
            val inputIndex = audioEncoder.dequeueInputBuffer(0)
            if (inputIndex >= 0) {
                val byteBuffer = audioEncoder.getInputBuffer(inputIndex)
                if (byteBuffer != null) {
                    byteBuffer.clear()
                    byteBuffer.put(mAudioBuffer, 0, length)
                    //The 4th parameter is the presentation timestamp in microseconds
                    audioEncoder.queueInputBuffer(inputIndex, 0, length, System.nanoTime() / 1000, 0)
                }
            }

            //Drain the encoded AAC data
            val outputIndex = audioEncoder.dequeueOutputBuffer(mBufferInfo, 0)
            if (outputIndex >= 0) {
                val byteBuffer = audioEncoder.getOutputBuffer(outputIndex)
                if (byteBuffer != null && mBufferInfo.size > 0) {
                    //The encoded data occupies [offset, offset + size) in the buffer
                    val encodedData = ByteArray(mBufferInfo.size)
                    byteBuffer.position(mBufferInfo.offset)
                    byteBuffer.get(encodedData)
                    //TODO: hand encodedData to a muxer or network stream
                }
                audioEncoder.releaseOutputBuffer(outputIndex, false)
            }
        }
    }
}
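
Using System.nanoTime() / 1000 as the timestamp works, but it can drift relative to the audio clock. An alternative sketch (assuming the 44.1 kHz, 16-bit, stereo configuration used in this example; totalPcmBytes and audioPtsUs are our own additions) derives the presentation time from the amount of PCM already queued:

//Hypothetical helper: presentation time in microseconds derived from the PCM bytes consumed so far
private var totalPcmBytes = 0L

private fun audioPtsUs(sampleRate: Int = 44100, channels: Int = 2, bytesPerSample: Int = 2): Long {
    val bytesPerSecond = sampleRate.toLong() * channels * bytesPerSample
    return totalPcmBytes * 1_000_000L / bytesPerSecond
}

//In the read loop, before queueInputBuffer:
//val ptsUs = audioPtsUs()
//totalPcmBytes += length
//audioEncoder.queueInputBuffer(inputIndex, 0, length, ptsUs, 0)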

Adaptive aspect ratio

/**
 * Automatically adjust the TextureView width and height to fit different preview aspect ratios
 */
override fun onWindowFocusChanged(hasFocus: Boolean) {
    super.onWindowFocusChanged(hasFocus)
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        val width = liveTextureView.width
        val height = liveTextureView.height
        val proportion1 = size.width.toFloat() / size.height
        val proportion2 = height.toFloat() / width
        if (proportion1 > proportion2) {
            val layoutParams = liveTextureView.layoutParams
            layoutParams.width = (height * proportion1 + .5).toInt()
            liveTextureView.layoutParams = layoutParams
        } else if (proportion1 < proportion2) {
            val layoutParams = liveTextureView.layoutParams
            layoutParams.height = (width * proportion1 + .5).toInt()
            liveTextureView.layoutParams = layoutParams
        }
    }
}

Cleanup

override fun onDestroy() {
    super.onDestroy()
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
        //Cancel the audio coroutine first so it stops using the recorder and encoder released below
        audioRecordJob?.cancel()
        cameraDevice.close()
        surface.release()
        inputSurface.release()
        videoEncoder.stop()
        videoEncoder.release()
        audioEncoder.stop()
        audioEncoder.release()
        audioRecord.stop()
        audioRecord.release()
        CameraUtil.releaseBackgroundThread()
    }
}

Utility class: CameraUtil

object CameraUtil {

    private lateinit var mBackgroundThread: HandlerThread
    private var mBackgroundHandler: Handler? = null

    //List of camera ids
    private lateinit var cameraIdList: Array<String>

    //Index of the current camera
    var index = 0

    private lateinit var application: Application

    private lateinit var cameraManager: CameraManager

    /**
     * Get the CameraManager
     */
    fun getCameraManager(): CameraManager {
        return cameraManager
    }

    /**
     * Initialization
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun init(application: Application) {
        this.application = application
        cameraManager = application.getSystemService(Context.CAMERA_SERVICE) as CameraManager
        cameraIdList = cameraManager.cameraIdList
    }

    /**
     * Get the list of camera ids
     */
    fun getCameraIdList(): Array<String> {
        return cameraIdList
    }

    /**
     * Get the CameraCharacteristics of a camera
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun getCameraCharacteristics(cameraId: String): CameraCharacteristics {
        return cameraManager.getCameraCharacteristics(cameraId)
    }

    /**
     * Get a preview size
     * Parameter 2: the preview aspect ratio, e.g. 4:3 or 16:9
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun getPreviewSize(@NotNull cameraCharacteristics: CameraCharacteristics, aspectRatio: Float): Size? {
        val streamConfigurationMap =
            cameraCharacteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP)
        val supportedSizes = streamConfigurationMap?.getOutputSizes(SurfaceTexture::class.java) ?: return null
        for (size in supportedSizes) {
            if (size.width.toFloat() / size.height == aspectRatio) {
                return size
            }
        }
        return null
    }

    /**
     * Get a preview size
     * Parameter 2: a list of aspect ratios; they are tried in insertion order and the first match is returned
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun getPreviewSize(@NotNull cameraCharacteristics: CameraCharacteristics, aspectRatios: ArrayList<Float>): Size {
        for (aspectRatio in aspectRatios) {
            val size = getPreviewSize(cameraCharacteristics, aspectRatio)
            if (size != null) {
                return size
            }
        }
        return Size(1280, 720)
    }

    /**
     * Get the preset preview size
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun getPreviewSize(@NotNull cameraCharacteristics: CameraCharacteristics): Size {
        val aspectRatios = ArrayList<Float>()
        aspectRatios.add(4.toFloat() / 3)
        aspectRatios.add(16.toFloat() / 9)
        aspectRatios.add(18.toFloat() / 9)
        return getPreviewSize(cameraCharacteristics, aspectRatios)
    }

    /**
     * Apply the preset preview size
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun setDefaultBufferSize(@NotNull surfaceTexture: SurfaceTexture, cameraCharacteristics: CameraCharacteristics): Size {
        val size = getPreviewSize(cameraCharacteristics)
        surfaceTexture.setDefaultBufferSize(size.width, size.height)
        return size
    }

    /**
     * Open the camera
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun openCamera(activity: Activity, cameraId: String, callback: CameraDevice.StateCallback) {
        if (ContextCompat.checkSelfPermission(
                activity,
                Manifest.permission.CAMERA
            ) == PackageManager.PERMISSION_GRANTED
        ) {
            cameraManager.openCamera(cameraId, callback, getBackgroundHandler())
        } else {
            val dialog = AlertDialog.Builder(activity)
            dialog.setTitle("Failed to open camera").setMessage("Camera permission is missing").setCancelable(false)

            dialog.setNegativeButton("Cancel") { _, _ ->

            }

            dialog.setPositiveButton("Grant") { _, _ ->
                val intent = Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS)
                intent.data = Uri.parse("package:" + activity.packageName)
                activity.startActivity(intent)
            }

            dialog.show()
        }
    }

    /**
     * Get the Bitmap from a still capture via the ImageReader
     */
    @RequiresApi(Build.VERSION_CODES.KITKAT)
    fun getImageReaderBitmap(it: ImageReader): Bitmap {
        val image = it.acquireLatestImage()
        val byteBuffer = image.planes[0].buffer
        val bytes = ByteArray(byteBuffer.remaining())
        byteBuffer.get(bytes)
        //Close the Image to free the ImageReader buffer
        image.close()
        return BitmapFactory.decodeByteArray(bytes, 0, bytes.size)
    }

    /**
     * Save to the photo album
     */
    fun saveAlbum(temp: Bitmap) {
        //Save to the system photo album
        MediaStore.Images.Media.insertImage(
            application.contentResolver,
            temp,
            "image_file",
            "file"
        )
    }

    /**
     * Create a preview CaptureRequest
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun createPreviewCaptureRequest(
        cameraDevice: CameraDevice,
        surface: Surface
    ): CaptureRequest.Builder {
        //Create a preview CaptureRequest
        val captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
        // Set the Surface for the preview output
        captureRequestBuilder.addTarget(surface)
        return captureRequestBuilder
    }

    /**
     * Create a still-capture CaptureRequest
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun createPhotographCaptureRequest(
        cameraDevice: CameraDevice,
        surface: Surface
    ): CaptureRequest.Builder {
        //Create a still-capture CaptureRequest
        val captureRequestBuilder =
            cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
        // Set the Surface for the still-capture output
        captureRequestBuilder.addTarget(surface)
        return captureRequestBuilder
    }

    /**
     * Create a video-recording CaptureRequest
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun createVideoCaptureRequest(
        cameraDevice: CameraDevice,
        surface: Surface,
        mediaRecorderSurface: Surface
    ): CaptureRequest.Builder {
        //Create a video-recording CaptureRequest
        val captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD)
        // Set the output Surfaces for recording
        captureRequestBuilder.addTarget(surface)
        captureRequestBuilder.addTarget(mediaRecorderSurface)
        return captureRequestBuilder
    }

    /**
     * Set the camera's automatic modes
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun automaticMode(captureRequestBuilder: CaptureRequest.Builder) {
        // Auto focus
        captureRequestBuilder.set(
            CaptureRequest.CONTROL_AF_MODE,
            CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE
        )
        //Auto exposure
        captureRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON)
        //Auto white balance
        captureRequestBuilder.set(
            CaptureRequest.CONTROL_AWB_MODE,
            CaptureRequest.CONTROL_AWB_MODE_AUTO
        )
    }

    /**
     * Set the orientation of the captured photo
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun photoDirection(captureRequestBuilder: CaptureRequest.Builder, angle: Int) {
        // Set the photo orientation computed from the device orientation
        captureRequestBuilder.set(CaptureRequest.JPEG_ORIENTATION, angle)
    }

    /**
     * Switch camera
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun switchCamera(activity: Activity, callback: CameraDevice.StateCallback): String {
        if (index < cameraIdList.size - 1) {
            index++
        } else {
            index = 0
        }
        val cameraId = cameraIdList[index]
        openCamera(activity, cameraId, callback)
        return cameraId
    }

    /**
     * Create and configure a MediaRecorder for video recording
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun getMediaRecorder(surface:Surface,size: Size): MediaRecorder {
        val mediaRecorder = MediaRecorder()

        //Initialized state
        //Set the audio source; MediaRecorder.AudioSource.MIC records from the microphone.
        mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC)
        //Set the video source to a Surface.
        mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE)

        //Set the container format of the recorded file; this moves the recorder to the DataSourceConfigured state
        mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4)
        //Set the audio encoding format.
        mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC)
        //Set where the recorded file is saved.
        mediaRecorder.setOutputFile(getFilePath("${System.currentTimeMillis()}.mp4"))

        //Set the video encoding bit rate.
        mediaRecorder.setVideoEncodingBitRate(10000000)
        //Set the video capture frame rate.
        mediaRecorder.setVideoFrameRate(30)
        //Set the width and height of the recorded video.
        mediaRecorder.setVideoSize(size.width, size.height)
        //Set the video encoding format.
        mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264)

        //Set the orientation of the recorded video
        setVideoOrientation(mediaRecorder)
        //Set the Surface used to display the video preview.
        mediaRecorder.setPreviewDisplay(surface)

        //Prepare for recording.
        mediaRecorder.prepare()

        return mediaRecorder
    }

    /**
     * Set the orientation of the recorded video
     */
    fun setVideoOrientation(mediaRecorder: MediaRecorder) {
        if (index == 0) {
            //Set the orientation hint for playback of the output video.
            mediaRecorder.setOrientationHint(90)
        } else {
            //Set the orientation hint for playback of the output video.
            mediaRecorder.setOrientationHint(90 * 3)
        }
    }

    /**
     * Get the background Handler
     */
    fun getBackgroundHandler(): Handler {
        if (mBackgroundHandler == null) {
            //Create the camera background thread
            mBackgroundThread = HandlerThread("CameraBackground")
            mBackgroundThread.start()
            mBackgroundHandler = Handler(mBackgroundThread.looper)
        }
        return mBackgroundHandler as Handler
    }

    /**
     * Release the background thread resources
     */
    fun releaseBackgroundThread() {
        mBackgroundHandler?.removeCallbacksAndMessages(null)
        mBackgroundHandler = null
        mBackgroundThread.quitSafely()
        mBackgroundThread.join()
    }

    /**
     * Show a message for camera open errors
     */
    fun showError(error: Int) {
        when (error) {
            CameraDevice.StateCallback.ERROR_CAMERA_IN_USE -> {
                showToast("The camera device is already in use by a higher-priority client")
            }
            CameraDevice.StateCallback.ERROR_MAX_CAMERAS_IN_USE -> {
                showToast("The maximum number of open cameras has been reached; no more cameras can be opened")
            }
            CameraDevice.StateCallback.ERROR_CAMERA_DISABLED -> {
                showToast("The camera device cannot be opened due to a device policy")
            }
            CameraDevice.StateCallback.ERROR_CAMERA_DEVICE -> {
                showToast("The camera device encountered a fatal error")
            }
            CameraDevice.StateCallback.ERROR_CAMERA_SERVICE -> {
                showToast("The camera service encountered a fatal error")
            }
        }
    }

    /**
     * Build a file path
     */
    private fun getFilePath(fileName: String): String {
        val path: String
        //Check whether external storage is available
        if (Environment.MEDIA_MOUNTED == Environment.getExternalStorageState() || !Environment.isExternalStorageRemovable()) {
            //Path: /mnt/sdcard/Android/data/<package name>/files/…
            path = application.getExternalFilesDir(Environment.DIRECTORY_PICTURES)!!.absolutePath
        } else {
            //Path: /data/data/<package name>/files/…
            path = application.filesDir.absolutePath
        }
        return path + File.separator + fileName
    }

    private fun showToast(msg: String) {
        Toast.makeText(application, msg, Toast.LENGTH_LONG).show()
    }

}

Utility class: MediaCodecUtil

object MediaCodecUtil {

    /**
     * Configure the video encoder format
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun createSurfaceVideoEncoder(size: Size): MediaCodec {
        //Video encoder
        val videoEncoder = MediaCodec.createEncoderByType("video/avc")
        // Create the video MediaFormat
        val videoFormat = MediaFormat.createVideoFormat("video/avc", size.width, size.height)

        // Specify the encoder color format (Surface input)
        videoFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
        // The bit rate must be specified for encoders
        videoFormat.setInteger(MediaFormat.KEY_BIT_RATE, 1250000)

        // The frame rate must be specified for encoders
        videoFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30)
        // Key frame (I-frame) interval in seconds
        videoFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5)
        //BITRATE_MODE_CBR keeps the output bit rate constant, BITRATE_MODE_CQ keeps image quality constant,
        //BITRATE_MODE_VBR raises the bit rate for complex frames and lowers it for simple frames
        videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR)

        videoEncoder.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        return videoEncoder
    }

    /**
     * Configure the audio encoder format
     */
    @RequiresApi(Build.VERSION_CODES.LOLLIPOP)
    fun createAudioEncoder(): MediaCodec {
        //Audio encoder
        val audioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm")
        // Create the audio MediaFormat; parameter 2: sample rate, parameter 3: channel count
        // (2 channels to match the CHANNEL_IN_STEREO AudioRecord used in this example)
        val audioFormat = MediaFormat.createAudioFormat("audio/mp4a-latm", 44100, 2)

        // The bit rate must be specified for encoders (96 kbps is a typical AAC bit rate)
        audioFormat.setInteger(MediaFormat.KEY_BIT_RATE, 96 * 1024)

        var bufferSizeInBytes = AudioRecordUtil.getBufferSizeInBytes()
        if (bufferSizeInBytes == 0) {
            bufferSizeInBytes = AudioRecord.getMinBufferSize(
                44100,
                AudioFormat.CHANNEL_IN_STEREO,
                AudioFormat.ENCODING_PCM_16BIT
            )
        }

        //Optional: maximum size of an input data buffer
        audioFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, bufferSizeInBytes)

        audioFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC)

        audioEncoder.configure(audioFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
        return audioEncoder
    }

}
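
Not every encoder implements BITRATE_MODE_CBR; many devices only support VBR, and requesting an unsupported mode may be ignored or rejected. A small sketch (the helper name is ours) for checking support before setting KEY_BITRATE_MODE:

/**
 * Returns true if the encoder supports the requested bitrate mode; sketch only.
 */
@RequiresApi(Build.VERSION_CODES.LOLLIPOP)
fun isBitrateModeSupported(codec: MediaCodec, mimeType: String, mode: Int): Boolean {
    return try {
        codec.codecInfo
            .getCapabilitiesForType(mimeType)
            .encoderCapabilities
            .isBitrateModeSupported(mode)
    } catch (e: IllegalArgumentException) {
        false
    }
}

//Usage: only request CBR when it is available
//if (isBitrateModeSupported(videoEncoder, "video/avc", MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR)) { ... }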

Utility class: AudioRecordUtil

object AudioRecordUtil {

    // Audio source: microphone input
    private const val AUDIO_INPUT = MediaRecorder.AudioSource.MIC

    // Sample rate
    // 44100 Hz is the current standard; some devices also support 22050, 16000, and 11025
    // Common sample rates fall into three tiers: 22.05 kHz, 44.1 kHz, and 48 kHz
    private const val AUDIO_SAMPLE_RATE = 44100

    // Audio channel: mono
    private const val AUDIO_CHANNEL = AudioFormat.CHANNEL_IN_MONO

    // Audio channel: stereo (CHANNEL_OUT_STEREO for playback, CHANNEL_IN_STEREO for recording)
    private const val AUDIO_CHANNEL2 = AudioFormat.CHANNEL_IN_STEREO

    // Audio format: PCM encoding
    private const val AUDIO_ENCODING = AudioFormat.ENCODING_PCM_16BIT

    private var bufferSizeInBytes:Int = 0

    /**
     * Get the buffer size in bytes
     */
    fun getBufferSizeInBytes():Int{
        return bufferSizeInBytes
    }

    /**
     * Get an AudioRecord (mono by default)
     */
    fun getSingleAudioRecord(
        channelConfig: Int = AUDIO_CHANNEL,
        audioSource: Int = AUDIO_INPUT,
        sampleRateInHz: Int = AUDIO_SAMPLE_RATE,
        audioFormat: Int = AUDIO_ENCODING
    ): AudioRecord {
        //The minimum buffer size the AudioRecord will accept
        bufferSizeInBytes = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, audioFormat)
        return AudioRecord(
            audioSource,
            sampleRateInHz,
            channelConfig,
            audioFormat,
            bufferSizeInBytes
        )
    }

}
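
One caveat worth knowing: the AudioRecord constructor can fail quietly, for example when the RECORD_AUDIO permission is missing or the parameter combination is unsupported, and the object then stays in STATE_UNINITIALIZED. A defensive variant (a sketch; getSingleAudioRecordOrNull is not part of the original util) could look like this:

//Sketch: returns the AudioRecord only if it initialized successfully, otherwise releases it and returns null
fun getSingleAudioRecordOrNull(channelConfig: Int = AudioFormat.CHANNEL_IN_MONO): AudioRecord? {
    val record = AudioRecordUtil.getSingleAudioRecord(channelConfig)
    return if (record.state == AudioRecord.STATE_INITIALIZED) {
        record
    } else {
        record.release()
        null
    }
}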

The next post will continue with decoding. The related utility classes will keep being improved; if you have good ideas, let's discuss them.
