Adding H264 encode/decode support to WebRTC on Android

I. The H264 codec libraries used by default in the WebRTC source

1. The video_coding module in the WebRTC source tree contains the H264 encoder/decoder classes

[Figure 1: H264 codec sources under modules/video_coding/codecs/h264]

Opening the two files underlined in red, you can see that the decoder class pulls in FFmpeg's avcodec.h:

extern "C" {
#include "third_party/ffmpeg/libavcodec/avcodec.h"
}  // extern "C"

while the encoder class pulls in the OpenH264 headers used for encoding:

#include "third_party/openh264/src/codec/api/svc/codec_api.h"
#include "third_party/openh264/src/codec/api/svc/codec_app_def.h"
#include "third_party/openh264/src/codec/api/svc/codec_def.h"
#include "third_party/openh264/src/codec/api/svc/codec_ver.h"

II. How these libraries are wired into the build

Under src/third_party in the source tree you can find both openh264 and ffmpeg. third_party holds the third-party libraries WebRTC depends on, and they are fetched together with the rest of the source. How do they get compiled and linked into WebRTC? That question starts with the build tools WebRTC uses.

III. WebRTC's build tools

depot_tools is Google's toolkit for source management and building. It includes gclient (fetching the source), gn (generating build scripts), ninja (compiling the source), and more. The tools relevant to compiling the source are gn and ninja.

1. gn (Generate Ninja)

1) Overview

The gn command processes files named BUILD.gn. A BUILD.gn file can declare a number of arguments, and running gn with different argument values produces different build configurations, for example whether WebRTC targets Android or iOS.

gn uses the directory hierarchy to organize build targets. In the WebRTC tree, each directory has its own BUILD.gn file; the presence of a BUILD.gn means the directory defines build targets, and those targets can depend on targets in subdirectories, forming a large but well-ordered build graph.

2) Basic syntax

Not covered here; see the official documentation and related articles:

https://www.chromium.org/developers/gn-build-configuration

https://chromium.googlesource.com/chromium/src/+/master/tools/gn/docs/reference.md

https://chromium.googlesource.com/chromium/src/+/master/tools/gn/docs/quick_start.md

3) Common commands

Generate ninja build files with gn:

gn gen out/Release --args='is_debug=false target_os="android"'

By default gn generates Debug build files; for a Release build, add is_debug=false to --args.

--args passes arguments to gn; what they mean is up to WebRTC's build system. For example, is_debug decides whether to build Debug or Release, and target_os="android" sets the target platform to Android.

If you have already run gn gen and want to see which arguments a build directory was generated with, run gn args out/Release --list. It lists all build arguments, their documentation, and their current values.

4) Overall build flow

Taking the WebRTC source as an example:

The src directory contains a .gn file and a BUILD.gn file.

.gn is WebRTC's top-level gn file; it specifies buildconfig = "//build/config/BUILDCONFIG.gn".

//build/config/BUILDCONFIG.gn is the master configuration file of the GN build. It is loaded after args.gn in the build directory (e.g. the out/Debug of gn gen out/Debug) and after src/.gn, and the result of evaluating it applies to every other file in the build, so the variables it defines are effectively global.

When you run gn gen out/Debug, the gn tool evaluates these files, generates the ninja files, and writes them into the specified build directory. The flow is shown below:

[Figure 2: gn gen execution flow]

2. ninja

1) Overview

gn's output is a set of files with the .ninja extension, stored in the build directory and consumed directly by the ninja command. The instructions in a ninja file are simple and explicit, with no extra logic or computation; that is what keeps ninja small and fast, and it is exactly what ninja was designed for.

Building all targets

The build commands from the official site:

cd webrtc_android/src

gn gen out/Debug --args='target_os="android" target_cpu="arm"' 

ninja -C out/Debug

2) Documentation: https://ninja-build.org/

3) How it works

Files with the .ninja suffix are ninja build files. For WebRTC, once gn gen has run, a build.ninja file appears under out/Release; you can think of it as the "Makefile" of the whole WebRTC project. It in turn references the ninja files of the individual modules.

To build all of WebRTC, run ninja -C out/Release from the src directory. The -C option tells ninja to change into out/Release before building, so it is equivalent to cd out/Release && ninja.

To build a single module, append the target name to the ninja command (a target defined in build.ninja, just like a target in a Makefile). For example:

webrtc/pc: ninja pc (builds the pc module)

webrtc/media: ninja media (builds the media module)

3. The webrtc_h264 target

In the BUILD.gn of src/modules/video_coding you can find the configuration of the webrtc_h264 target. Its sources list the H264 encoder/decoder implementation files, and inside the rtc_use_h264 conditional you can see the dependencies on ffmpeg and openh264.

rtc_library is a GN template defined by WebRTC; for details on custom templates, see the GN documentation or related articles.

rtc_library("webrtc_h264") {
  visibility = [ "*" ]
  sources = [
    "codecs/h264/h264.cc",
    "codecs/h264/h264_color_space.cc",
    "codecs/h264/h264_color_space.h",
    "codecs/h264/h264_decoder_impl.cc",
    "codecs/h264/h264_decoder_impl.h",
    "codecs/h264/h264_encoder_impl.cc",
    "codecs/h264/h264_encoder_impl.h",
    "codecs/h264/include/h264.h",
  ]

  defines = []
  deps = [
    ":video_codec_interface",
    ":video_coding_utility",
    "../../api/video:video_frame",
    "../../api/video:video_frame_i010",
    "../../api/video:video_rtp_headers",
    "../../api/video_codecs:video_codecs_api",
    "../../common_video",
    "../../media:rtc_h264_profile_id",
    "../../media:rtc_media_base",
    "../../rtc_base",
    "../../rtc_base:checks",
    "../../rtc_base/system:rtc_export",
    "../../system_wrappers:field_trial",
    "../../system_wrappers:metrics",
    "//third_party/libyuv",
  ]
  absl_deps = [
    "//third_party/abseil-cpp/absl/strings",
    "//third_party/abseil-cpp/absl/types:optional",
  ]

  if (rtc_use_h264) {
    deps += [
      "//third_party/ffmpeg",
      "//third_party/openh264:encoder",
    ]
    if (!build_with_mozilla) {
      deps += [ "../../media:rtc_media_base" ]
    }
  }
}

IV. Enabling H264 encode/decode in WebRTC

This write-up covers only the openh264-encode / ffmpeg-decode approach; other approaches are left to the reader.

1. Set rtc_use_h264=true and ffmpeg_branding="Chrome" via the gn args

The following command builds everything into a libwebrtc.aar that can be used directly:

tools_webrtc/android/build_aar.py --build-dir out --arch "armeabi-v7a" "arm64-v8a" \
    --extra-gn-args='rtc_use_h264=true ffmpeg_branding = "Chrome"'

2. Modify src/third_party/ffmpeg/ffmpeg_generated.gni

This file defines which source files ffmpeg is built from, including the H264-related ones.

Search the file for the H264-related entries; there are 12 of them. In each of the 12 conditions, extend the OS check to include Android. The modified conditions are listed below; compare them against the original file to see the differences.

1)

if ((is_mac) || (is_win) || (use_linux_config) || (is_android)) {
  ffmpeg_c_sources += [
    "libavcodec/autorename_libavcodec_hpeldsp.c",
    "libavcodec/autorename_libavcodec_videodsp.c",
    "libavcodec/autorename_libavcodec_vp3dsp.c",
    "libavcodec/autorename_libavcodec_vp8dsp.c",
    "libavcodec/h264pred.c",
    "libavcodec/vp3.c",
    "libavcodec/vp3_parser.c",
    "libavcodec/vp56rac.c",
    "libavcodec/vp8.c",
    "libavcodec/vp8_parser.c",
  ]
}

2)

if ((is_mac && ffmpeg_branding == "Chrome") || (is_win && ffmpeg_branding == "Chrome") || (use_linux_config && ffmpeg_branding == "Chrome") || (use_linux_config && ffmpeg_branding == "ChromeOS")||(is_android)) {
  ffmpeg_c_sources += [
    "libavcodec/atsc_a53.c",
    "libavcodec/cabac.c",
    "libavcodec/h2645_parse.c",
    "libavcodec/h264_cabac.c",
    "libavcodec/h264_cavlc.c",
    "libavcodec/h264_direct.c",
    "libavcodec/h264_loopfilter.c",
    "libavcodec/h264_mb.c",
    "libavcodec/h264_parse.c",
    "libavcodec/h264_parser.c",
    "libavcodec/h264_picture.c",
    "libavcodec/h264_ps.c",
    "libavcodec/h264_refs.c",
    "libavcodec/h264_sei.c",
    "libavcodec/h264_slice.c",
    "libavcodec/h264chroma.c",
    "libavcodec/h264data.c",
    "libavcodec/h264dec.c",
    "libavcodec/h264dsp.c",
    "libavcodec/h264idct.c",
    "libavcodec/h264qpel.c",
    "libavcodec/startcode.c",
  ]
}

3)

if ((is_mac && current_cpu == "x64") || (is_win && current_cpu == "x64") || (is_win && current_cpu == "x86") || (use_linux_config && current_cpu == "x64") || (use_linux_config && current_cpu == "x86")|| (is_android && current_cpu == "x86")) {
  ffmpeg_c_sources += [
    "libavcodec/x86/autorename_libavcodec_x86_videodsp_init.c",
    "libavcodec/x86/h264_intrapred_init.c",
    "libavcodec/x86/hpeldsp_init.c",
    "libavcodec/x86/hpeldsp_vp3_init.c",
    "libavcodec/x86/vp3dsp_init.c",
    "libavcodec/x86/vp8dsp_init.c",
  ]
  ffmpeg_asm_sources += [
    "libavcodec/x86/autorename_libavcodec_x86_videodsp.asm",
    "libavcodec/x86/fpel.asm",
    "libavcodec/x86/h264_intrapred.asm",
    "libavcodec/x86/h264_intrapred_10bit.asm",
    "libavcodec/x86/hpeldsp.asm",
    "libavcodec/x86/hpeldsp_vp3.asm",
    "libavcodec/x86/vp3dsp.asm",
    "libavcodec/x86/vp8dsp.asm",
    "libavcodec/x86/vp8dsp_loopfilter.asm",
  ]
}

4)

if ((use_linux_config && current_cpu == "arm" && arm_use_neon) || (use_linux_config && current_cpu == "arm")||(is_android && current_cpu == "arm" && arm_use_neon)||(is_android && current_cpu == "arm")) {
  ffmpeg_c_sources += [
    "libavcodec/arm/h264pred_init_arm.c",
    "libavcodec/arm/hpeldsp_init_arm.c",
    "libavcodec/arm/hpeldsp_init_armv6.c",
    "libavcodec/arm/videodsp_init_arm.c",
    "libavcodec/arm/videodsp_init_armv5te.c",
    "libavcodec/arm/vp3dsp_init_arm.c",
    "libavcodec/arm/vp8dsp_init_arm.c",
    "libavcodec/arm/vp8dsp_init_armv6.c",
  ]
  ffmpeg_gas_sources += [
    "libavcodec/arm/hpeldsp_arm.S",
    "libavcodec/arm/hpeldsp_armv6.S",
    "libavcodec/arm/videodsp_armv5te.S",
    "libavcodec/arm/vp8_armv6.S",
    "libavcodec/arm/vp8dsp_armv6.S",
  ]
}

5)

if ((use_linux_config && current_cpu == "mips64el") || (use_linux_config && current_cpu == "mipsel")||(is_android && current_cpu == "mips64el") ||(is_android && current_cpu == "mipsel")) {
  ffmpeg_c_sources += [
    "libavcodec/mips/autorename_libavcodec_mips_videodsp_init.c",
    "libavcodec/mips/h264pred_init_mips.c",
    "libavcodec/mips/hpeldsp_init_mips.c",
    "libavcodec/mips/vp3dsp_init_mips.c",
    "libavcodec/mips/vp8dsp_init_mips.c",
    "libavutil/mips/cpu.c",
    "libavutil/mips/float_dsp_mips.c",
  ]
}

6)

if ((is_mac && current_cpu == "arm64") || (is_win && current_cpu == "arm64") || (use_linux_config && current_cpu == "arm64")|| (is_android && current_cpu == "arm64")) {
  ffmpeg_c_sources += [
    "libavcodec/aarch64/h264pred_init.c",
    "libavcodec/aarch64/hpeldsp_init_aarch64.c",
    "libavcodec/aarch64/videodsp_init.c",
    "libavcodec/aarch64/vp8dsp_init_aarch64.c",
  ]
  ffmpeg_gas_sources += [
    "libavcodec/aarch64/autorename_libavcodec_aarch64_h264pred_neon.S",
    "libavcodec/aarch64/autorename_libavcodec_aarch64_hpeldsp_neon.S",
    "libavcodec/aarch64/autorename_libavcodec_aarch64_vp8dsp_neon.S",
    "libavcodec/aarch64/videodsp.S",
  ]
}

7)

if ((is_mac && current_cpu == "arm64" && ffmpeg_branding == "Chrome") || (is_win && current_cpu == "arm64" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "arm64" && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "arm64" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "arm64" && ffmpeg_branding == "ChromeOS")) {
  ffmpeg_c_sources += [
    "libavcodec/aarch64/h264chroma_init_aarch64.c",
    "libavcodec/aarch64/h264dsp_init_aarch64.c",
    "libavcodec/aarch64/h264qpel_init_aarch64.c",
  ]
  ffmpeg_gas_sources += [
    "libavcodec/aarch64/autorename_libavcodec_aarch64_h264cmc_neon.S",
    "libavcodec/aarch64/autorename_libavcodec_aarch64_h264dsp_neon.S",
    "libavcodec/aarch64/autorename_libavcodec_aarch64_h264idct_neon.S",
    "libavcodec/aarch64/autorename_libavcodec_aarch64_h264qpel_neon.S",
  ]
}

8)

if ((use_linux_config && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "ChromeOS") || (use_linux_config && current_cpu == "arm" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "arm" && ffmpeg_branding == "ChromeOS")||(is_android && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "ChromeOS") || (is_android && current_cpu == "arm" && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "arm" && ffmpeg_branding == "ChromeOS")) {
  ffmpeg_c_sources += [
    "libavcodec/arm/h264chroma_init_arm.c",
    "libavcodec/arm/h264dsp_init_arm.c",
    "libavcodec/arm/h264qpel_init_arm.c",
  ]
  ffmpeg_gas_sources += [
    "libavcodec/arm/startcode_armv6.S",
  ]
}

9)

if ((is_mac && current_cpu == "x64" && ffmpeg_branding == "Chrome") || (is_win && current_cpu == "x64" && ffmpeg_branding == "Chrome") || (is_win && current_cpu == "x86" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "x64" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "x64" && ffmpeg_branding == "ChromeOS") || (use_linux_config && current_cpu == "x86" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "x86" && ffmpeg_branding == "ChromeOS") || (is_android && current_cpu == "x64" && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "x64" && ffmpeg_branding == "ChromeOS") || (is_android && current_cpu == "x86" && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "x86" && ffmpeg_branding == "ChromeOS")) {
  ffmpeg_c_sources += [
    "libavcodec/x86/h264_qpel.c",
    "libavcodec/x86/h264chroma_init.c",
    "libavcodec/x86/h264dsp_init.c",
  ]
  ffmpeg_asm_sources += [
    "libavcodec/x86/h264_chromamc.asm",
    "libavcodec/x86/h264_chromamc_10bit.asm",
    "libavcodec/x86/h264_deblock.asm",
    "libavcodec/x86/h264_deblock_10bit.asm",
    "libavcodec/x86/h264_idct.asm",
    "libavcodec/x86/h264_idct_10bit.asm",
    "libavcodec/x86/h264_qpel_10bit.asm",
    "libavcodec/x86/h264_qpel_8bit.asm",
    "libavcodec/x86/h264_weight.asm",
    "libavcodec/x86/h264_weight_10bit.asm",
    "libavcodec/x86/qpel.asm",
  ]
}

10)

if ((use_linux_config && current_cpu == "mips64el" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "mips64el" && ffmpeg_branding == "ChromeOS") || (use_linux_config && current_cpu == "mipsel" && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "mipsel" && ffmpeg_branding == "ChromeOS")||(is_android && current_cpu == "mips64el" && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "mips64el" && ffmpeg_branding == "ChromeOS") || (is_android && current_cpu == "mipsel" && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "mipsel" && ffmpeg_branding == "ChromeOS")) {
  ffmpeg_c_sources += [
    "libavcodec/mips/aacdec_mips.c",
    "libavcodec/mips/aacpsdsp_mips.c",
    "libavcodec/mips/aacsbr_mips.c",
    "libavcodec/mips/h264chroma_init_mips.c",
    "libavcodec/mips/h264dsp_init_mips.c",
    "libavcodec/mips/h264qpel_init_mips.c",
    "libavcodec/mips/sbrdsp_mips.c",
  ]
}

11)

if ((use_linux_config && current_cpu == "arm" && arm_use_neon) ||(is_android && current_cpu == "arm" && arm_use_neon)){
  ffmpeg_c_sources += [
    "libavcodec/arm/hpeldsp_init_neon.c",
    "libavcodec/arm/vp8dsp_init_neon.c",
  ]
  ffmpeg_gas_sources += [
    "libavcodec/arm/h264pred_neon.S",
    "libavcodec/arm/hpeldsp_neon.S",
    "libavcodec/arm/vp3dsp_neon.S",
    "libavcodec/arm/vp8dsp_neon.S",
  ]
}

12)

if ((use_linux_config && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "Chrome") || (use_linux_config && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "ChromeOS")||(is_android && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "Chrome") || (is_android && current_cpu == "arm" && arm_use_neon && ffmpeg_branding == "ChromeOS")) {
  ffmpeg_gas_sources += [
    "libavcodec/arm/h264cmc_neon.S",
    "libavcodec/arm/h264dsp_neon.S",
    "libavcodec/arm/h264idct_neon.S",
    "libavcodec/arm/h264qpel_neon.S",
  ]
}

3. Statically register ffmpeg's H264 parser and decoder

In src/third_party/ffmpeg/chromium/config/Chrome/android/{ABI}/libavcodec/codec_list.c and
src/third_party/ffmpeg/chromium/config/Chrome/android/{ABI}/libavcodec/parser_list.c, add &ff_h264_decoder and &ff_h264_parser respectively:

static const AVCodec * const codec_list[] = {
    &ff_aac_decoder,
    &ff_flac_decoder,
    &ff_mp3_decoder,
    &ff_vorbis_decoder,
    &ff_pcm_alaw_decoder,
    &ff_pcm_f32le_decoder,
    &ff_pcm_mulaw_decoder,
    &ff_pcm_s16be_decoder,
    &ff_pcm_s16le_decoder,
    &ff_pcm_s24be_decoder,
    &ff_pcm_s24le_decoder,
    &ff_pcm_s32le_decoder,
    &ff_pcm_u8_decoder,
    &ff_libopus_decoder,
    &ff_h264_decoder,
    NULL };
static const AVCodecParser * const parser_list[] = {
    &ff_aac_parser,
    &ff_flac_parser,
    &ff_mpegaudio_parser,
    &ff_opus_parser,
    &ff_vorbis_parser,
    &ff_vp9_parser,
    &ff_h264_parser,
    NULL };

4. Update the macro definition

Open src/third_party/ffmpeg/chromium/config/Chrome/android/{ABI}/config.h, search for CONFIG_H264_DECODER, and change its value to 1.

5. Add license entries

Open src/tools_webrtc/libs/generate_licenses.py, find the LIB_TO_LICENSES_DICT dictionary, and add:

'openh264': ['third_party/openh264/src/LICENSE'],
'ffmpeg': ['third_party/ffmpeg/LICENSE.md'],

6. Implement the JNI and Java layers

With the steps above, the H264 codec is now built into WebRTC. Next, implement the JNI and Java layers by mimicking the VP8/VP9 implementations.
Go to src/sdk/android/src/jni and create h264_codec.cc.

1) JNI layer

#include <jni.h>

#include "modules/video_coding/codecs/h264/include/h264.h"
#include "sdk/android/generated_h264_jni/H264Decoder_jni.h"
#include "sdk/android/generated_h264_jni/H264Encoder_jni.h"
#include "sdk/android/src/jni/jni_helpers.h"

namespace webrtc {
namespace jni {

static jlong JNI_H264Encoder_CreateEncoder(JNIEnv* jni) {
  return jlongFromPointer(H264Encoder::Create().release());
}

static jboolean JNI_H264Encoder_IsSupported(JNIEnv* jni) {
  return !SupportedH264Codecs().empty();
}

static jlong JNI_H264Decoder_CreateDecoder(JNIEnv* jni) {
  return jlongFromPointer(H264Decoder::Create().release());
}

static jboolean JNI_H264Decoder_IsSupported(JNIEnv* jni) {
  return !SupportedH264Codecs().empty();
}

}  // namespace jni
}  // namespace webrtc

2) Java layer

Go to src/sdk/android/api/org/webrtc and, following LibvpxVp9Decoder.java and LibvpxVp9Encoder.java, write H264Decoder.java and H264Encoder.java as follows:

package org.webrtc;

public class H264Decoder extends WrappedNativeVideoDecoder {
  @Override
  public long createNativeVideoDecoder() {
    return nativeCreateDecoder();
  }

  static native long nativeCreateDecoder();

  static native boolean nativeIsSupported();
}

And H264Encoder.java:

package org.webrtc;

public class H264Encoder extends WrappedNativeVideoEncoder {
  @Override
  public long createNativeVideoEncoder() {
    return nativeCreateEncoder();
  }

  static native long nativeCreateEncoder();

  @Override
  public boolean isHardwareEncoder() {
    return false;
  }

  static native boolean nativeIsSupported();
}

3) For H264Encoder, two more files in the native layer (circled in the figure below) need changes

[Figure 3: the two files where H264Encoder is declared and implemented]

Add a parameterless Create() declaration and its implementation:

  static std::unique_ptr<H264Encoder> Create();

std::unique_ptr<H264Encoder> H264Encoder::Create() {
#if defined(WEBRTC_USE_H264)
  return std::make_unique<H264EncoderImpl>(cricket::VideoCodec());
#else
  RTC_NOTREACHED();
  return nullptr;
#endif
}

7. Update the software encoder/decoder factories

Open src/sdk/android/api/org/webrtc, find the SoftwareVideoDecoderFactory class, and add the following to its createDecoder method:

if (codecType.getName().equalsIgnoreCase("H264") && H264Decoder.nativeIsSupported()) {
  return new H264Decoder();
}

And add this to its supportedCodecs method:

if (H264Decoder.nativeIsSupported()) {
  codecs.add(new VideoCodecInfo("H264", new HashMap<>()));
}

Then open the SoftwareVideoEncoderFactory class and add the corresponding code to its createEncoder and supportedCodecs methods:

if (info.name.equalsIgnoreCase("H264") && H264Encoder.nativeIsSupported()) {
  return new H264Encoder();
}

if (H264Encoder.nativeIsSupported()) {
  codecs.add(new VideoCodecInfo("H264", new HashMap<>()));
}
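For orientation, here is a minimal sketch of how SoftwareVideoDecoderFactory ends up looking with the H264 branches in place. The surrounding VP8/VP9 code and the exact method shapes are paraphrased from memory of this era of the SDK (deprecated overloads and annotations trimmed) and may differ slightly in your revision; only the two H264 additions are the actual change. SoftwareVideoEncoderFactory is modified symmetrically with H264Encoder.

package org.webrtc;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

public class SoftwareVideoDecoderFactory implements VideoDecoderFactory {
  @Override
  public VideoDecoder createDecoder(VideoCodecInfo codecType) {
    if (codecType.getName().equalsIgnoreCase("VP8")) {
      return new LibvpxVp8Decoder();
    }
    if (codecType.getName().equalsIgnoreCase("VP9") && LibvpxVp9Decoder.nativeIsSupported()) {
      return new LibvpxVp9Decoder();
    }
    // Added: hand H264 to the new software decoder when the native side
    // reports support (i.e. the build was made with rtc_use_h264=true).
    if (codecType.getName().equalsIgnoreCase("H264") && H264Decoder.nativeIsSupported()) {
      return new H264Decoder();
    }
    return null;
  }

  @Override
  public VideoCodecInfo[] getSupportedCodecs() {
    return supportedCodecs();
  }

  static VideoCodecInfo[] supportedCodecs() {
    List<VideoCodecInfo> codecs = new ArrayList<>();
    codecs.add(new VideoCodecInfo("VP8", new HashMap<>()));
    if (LibvpxVp9Decoder.nativeIsSupported()) {
      codecs.add(new VideoCodecInfo("VP9", new HashMap<>()));
    }
    // Added: advertise H264 so it can be negotiated in SDP.
    if (H264Decoder.nativeIsSupported()) {
      codecs.add(new VideoCodecInfo("H264", new HashMap<>()));
    }
    return codecs.toArray(new VideoCodecInfo[codecs.size()]);
  }
}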

8. Add the H264 build targets

Open src/sdk/android/BUILD.gn, search for vp9, and write the h264 build targets by analogy with the vp9 ones, as follows:

 dist_jar("libwebrtc") {
    _target_dir_name = get_label_info(":$target_name", "dir")
    output = "${root_out_dir}/lib.java${_target_dir_name}/${target_name}.jar"
    direct_deps_only = true
    use_unprocessed_jars = true
    requires_android = true
    no_build_hooks = true

    deps = [
      ":audio_api_java",
      ":base_java",
      ":builtin_audio_codecs_java",
      ":camera_java",
      ":default_video_codec_factory_java",
      ":filevideo_java",
      ":hwcodecs_java",
      ":java_audio_device_module_java",
      ":libjingle_peerconnection_java",
      ":libjingle_peerconnection_metrics_default_java",
      ":libvpx_vp8_java",
      ":libvpx_vp9_java",
      ":h264_java",//添加
      ":logging_java",
      ":peerconnection_java",
      ":screencapturer_java",
      ":surfaceviewrenderer_java",
      ":swcodecs_java",
      ":video_api_java",
      ":video_java",
      "../../modules/audio_device:audio_device_java",
      "../../rtc_base:base_java",
    ]
  }
rtc_android_library("h264_java") {
      visibility = [ "*" ]
      sources = [
        "api/org/webrtc/H264Decoder.java",
        "api/org/webrtc/H264Encoder.java",
      ]
      deps = [
        ":base_java",
        ":video_api_java",
        ":video_java",
        "//rtc_base:base_java",
      ]
    }
 rtc_android_library("swcodecs_java") {
    visibility = [ "*" ]
    sources = [
      "api/org/webrtc/SoftwareVideoDecoderFactory.java",
      "api/org/webrtc/SoftwareVideoEncoderFactory.java",
    ]

    deps = [
      ":base_java",
      ":libvpx_vp8_java",
      ":libvpx_vp9_java",
      ":h264_java",
      ":video_api_java",
      ":video_java",
      "//rtc_base:base_java",
      "//third_party/android_deps:com_android_support_support_annotations_java",
    ]
  }
 rtc_library("h264_jni") {
    visibility = [ "*" ]
    allow_poison = [ "software_video_codecs" ]
    sources = [ "src/jni/h264_codec.cc" ]
    deps = [
      ":base_jni",
      ":generated_h264_jni",
      ":video_jni",
      "../../modules/video_coding:webrtc_h264",
    ]
  }
  rtc_library("swcodecs_jni") {
    visibility = [ "*" ]
    allow_poison = [ "software_video_codecs" ]
    deps = [
      ":libvpx_vp8_jni",
      ":libvpx_vp9_jni",
      ":h264_jni",
    ]
  }

All of the snippets above are GN; see the GN documentation for the details of each construct. The snippet below generates the JNI glue for H264Decoder.java and H264Encoder.java, which ends up compiled into the .so library:

 generate_jni("generated_h264_jni") {
      sources = [
        "api/org/webrtc/H264Decoder.java",
        "api/org/webrtc/H264Encoder.java",
      ]

      namespace = "webrtc::jni"
      jni_generator_include = "//sdk/android/src/jni/jni_generator_helper.h"
    }

9. With all of the configuration and source changes above in place, build the AAR:

tools_webrtc/android/build_aar.py --build-dir out --arch "armeabi-v7a" "arm64-v8a" --extra-gn-args='rtc_use_h264=true ffmpeg_branding = "Chrome"'
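To sanity-check the resulting AAR in an app, one option is to plug the software factories into a PeerConnectionFactory and dump what they advertise. This is only a minimal sketch under a typical WebRTC Android setup; H264Check is a throwaway helper class name, appContext stands in for your Application context, and error handling, threading, and EGL setup are omitted.

import android.content.Context;
import android.util.Log;

import org.webrtc.PeerConnectionFactory;
import org.webrtc.SoftwareVideoDecoderFactory;
import org.webrtc.SoftwareVideoEncoderFactory;
import org.webrtc.VideoCodecInfo;

public final class H264Check {
  public static void verify(Context appContext) {
    // One-time global initialization of the native library.
    PeerConnectionFactory.initialize(
        PeerConnectionFactory.InitializationOptions.builder(appContext)
            .createInitializationOptions());

    SoftwareVideoEncoderFactory encoderFactory = new SoftwareVideoEncoderFactory();
    SoftwareVideoDecoderFactory decoderFactory = new SoftwareVideoDecoderFactory();

    PeerConnectionFactory factory = PeerConnectionFactory.builder()
        .setVideoEncoderFactory(encoderFactory)
        .setVideoDecoderFactory(decoderFactory)
        .createPeerConnectionFactory();

    // With the patched build, both lists should now contain "H264" next to VP8/VP9.
    for (VideoCodecInfo info : encoderFactory.getSupportedCodecs()) {
      Log.d("H264Check", "software encoder: " + info.name);
    }
    for (VideoCodecInfo info : decoderFactory.getSupportedCodecs()) {
      Log.d("H264Check", "software decoder: " + info.name);
    }

    factory.dispose();
  }
}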

 

 

 
