Why Are Beauty Plugins Faster Than Traditional Integration?

People say Extensions Marketplace (云市场) plugins are fast to work with, but where exactly does that speed come from?

Beauty effects are a core feature of real-time interactive apps, widely used in social, live-streaming, dating, and conferencing scenarios, yet integrating them is often a headache for developers. In this post we look at where the Marketplace beauty plugins are faster than the traditional raw-data integration approach.

Fast to get started

Every plugin ships with its own sample code and documentation. Running the sample before integrating lets developers see the beauty effect right away and quickly understand what it takes to bring the plugin into their own project.

Download the plugin, download the resource package, fill in the appId, and set up a test certificate.

Sync Gradle, hit Run app, and you're done.

Fast to integrate

Take integrating 相芯 (FaceUnity) beauty as an example.

With traditional raw-data integration, you need to:

  • Add a large amount of Camera code to your project

  • Pull every frame of image data out of the Camera callbacks

  • Hand each frame to 相芯 for image processing

  • Feed the processed data back into the 声网 (Agora) video pipeline for encoding and sending

That means pulling a lot of code into the project, and it demands solid audio/video knowledge and hands-on experience from developers, for example: which video data formats 相芯 expects, how to handle Android textures, and so on. Once the integration is done, you still have to debug audio and video experience issues such as echo, howling, audio-video desync, black screens, stuttering, crashes, performance, and device compatibility.

As a leader in real-time engagement, 声网 has packaged these integration best practices for you: the 相芯 beauty plugin on the Marketplace takes care of all of the work listed above, saving a great deal of time and effort. Three APIs are enough to complete the integration cleanly (see the sketch after this list):

  • addExtension()
  • enableExtension()
  • setExtensionProperty()
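
To make this concrete, here is a minimal sketch of what the plugin-based integration can look like on Android, using the io.agora.rtc2 RtcEngine / RtcEngineConfig classes. It is not the plugin's official sample code: the extension library name, the provider and extension names, and the property key below are placeholders, so take the real values from the 相芯 plugin's documentation and sample project.

    // Minimal sketch of the three-call integration; all string constants are placeholders.
    private RtcEngine createEngineWithBeautyExtension(Context context, String appId,
                                                      IRtcEngineEventHandler handler) throws Exception {
        RtcEngineConfig config = new RtcEngineConfig();
        config.mContext = context;
        config.mAppId = appId;                                    // appId from the console
        config.mEventHandler = handler;
        // 1. addExtension(): load the extension library before the engine is created.
        config.addExtension("agora_faceunity_extension");         // placeholder library name
        RtcEngine engine = RtcEngine.create(config);

        // 2. enableExtension(): switch the extension on for the local video stream.
        engine.enableExtension("FaceUnity", "Effect", true);      // placeholder provider/extension names

        // 3. setExtensionProperty(): configure effects through key-value properties.
        engine.setExtensionProperty("FaceUnity", "Effect",
                "enable_face_beautification", "true");            // placeholder property key
        return engine;
    }

The extension hooks into the capture pipeline internally, so the camera capture and frame hand-off boilerplate shown in the raw-data sample later in this post is no longer your code to write.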

Once that's done, the plugin's effect is applied to the 声网 video stream, and you can layer on more extensions on top of it, such as real-time voice changing, real-time translation, content moderation, and more.

This dramatically reduces the workload for real-time interactive developers: the integration code stays clean and easy to maintain, a lot of testing and debugging time is saved, and developers are free to focus on business logic and user experience.

Still hard to picture? Let's look directly at the 相芯 beauty integration examples:

The first is the raw-data integration approach; the second is the 相芯 beauty plugin integration:

[Images 3-5]

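    // Excerpt from the raw-data integration sample (a FaceUnityBeauty fragment):
    // wiring up the beauty toggles, camera switch, and input-mode controls.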
    private void initVideoView() {
        mBinding.cbFaceBeautify.setOnCheckedChangeListener((buttonView, isChecked) -> {
            if (iBeautyFaceUnity == null) {
                return;
            }
            iBeautyFaceUnity.setFaceBeautifyEnable(isChecked);
        });
        mBinding.cbMakeup.setOnCheckedChangeListener((buttonView, isChecked) -> {
            if (iBeautyFaceUnity == null) {
                return;
            }
            iBeautyFaceUnity.setMakeUpEnable(isChecked);
        });
        mBinding.cbSticker.setOnCheckedChangeListener((buttonView, isChecked) -> {
            if (iBeautyFaceUnity == null) {
                return;
            }
            iBeautyFaceUnity.setStickerEnable(isChecked);
        });
        mBinding.cbBodyBeauty.setOnCheckedChangeListener((buttonView, isChecked) -> {
            if (iBeautyFaceUnity == null) {
                return;
            }
            iBeautyFaceUnity.setBodyBeautifyEnable(isChecked);
        });
        mBinding.ivCamera.setOnClickListener(v -> {
            rtcEngine.switchCamera();
            isFrontCamera = !isFrontCamera;
        });
        mBinding.tvBeautyInput.setText(isSingleInput ? R.string.beauty_input_single : R.string.beauty_input_double);
        mBinding.tvBeautyInput.setOnClickListener(v -> {
            isSingleInput = !isSingleInput;
            mBinding.tvBeautyInput.setText(isSingleInput ? R.string.beauty_input_single : R.string.beauty_input_double);
        });
        mBinding.smallVideoContainer.setOnClickListener(v -> updateVideoLayouts(!FaceUnityBeauty.this.isLocalFull));
    }

    private void initRtcEngine() {
        try {
            mRtcEngineEventHandler = new IRtcEngineEventHandler() {
                @Override
                public void onError(int err) {
                    super.onError(err);
                    showLongToast(String.format(Locale.US, "msg:%s, code:%d", RtcEngine.getErrorDescription(err), err));
                }

                @Override
                public void onJoinChannelSuccess(String channel, int uid, int elapsed) {
                    super.onJoinChannelSuccess(channel, uid, elapsed);
                    mLocalVideoLayout.setReportUid(uid);
                }

                @Override
                public void onUserJoined(int uid, int elapsed) {
                    super.onUserJoined(uid, elapsed);
                    runOnUIThread(() -> {
                        if (mRemoteVideoLayout == null) {
                            mRemoteVideoLayout = new VideoReportLayout(requireContext());
                            mRemoteVideoLayout.setReportUid(uid);
                            TextureView videoView = new TextureView(requireContext());
                            rtcEngine.setupRemoteVideo(new VideoCanvas(videoView, Constants.RENDER_MODE_HIDDEN, uid));
                            mRemoteVideoLayout.addView(videoView);
                            updateVideoLayouts(isLocalFull);
                        }
                    });
                }

                @Override
                public void onUserOffline(int uid, int reason) {
                    super.onUserOffline(uid, reason);
                    runOnUIThread(() -> {
                        if (mRemoteVideoLayout != null && mRemoteVideoLayout.getReportUid() == uid) {
                            mRemoteVideoLayout.removeAllViews();
                            mRemoteVideoLayout = null;
                            updateVideoLayouts(isLocalFull);
                        }
                    });
                }

                @Override
                public void onLocalAudioStats(LocalAudioStats stats) {
                    super.onLocalAudioStats(stats);
                    runOnUIThread(() -> mLocalVideoLayout.setLocalAudioStats(stats));
                }

                @Override
                public void onLocalVideoStats(Constants.VideoSourceType source, LocalVideoStats stats) {
                    super.onLocalVideoStats(source, stats);
                    runOnUIThread(() -> mLocalVideoLayout.setLocalVideoStats(stats));
                }

                @Override
                public void onRemoteAudioStats(RemoteAudioStats stats) {
                    super.onRemoteAudioStats(stats);
                    if (mRemoteVideoLayout != null) {
                        runOnUIThread(() -> mRemoteVideoLayout.setRemoteAudioStats(stats));
                    }
                }

                @Override
                public void onRemoteVideoStats(RemoteVideoStats stats) {
                    super.onRemoteVideoStats(stats);
                    if (mRemoteVideoLayout != null) {
                        runOnUIThread(() -> mRemoteVideoLayout.setRemoteVideoStats(stats));
                    }
                }
            };
            rtcEngine = RtcEngine.create(getContext(), getString(R.string.agora_app_id), mRtcEngineEventHandler);

            if (rtcEngine == null) {
                return;
            }


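            // Raw video frame observer: every captured frame is intercepted here, handed to
            // the beauty wrapper (IBeautyFaceUnity) for processing, and written back into the pipeline.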
            mVideoFrameObserver = new IVideoFrameObserver() {
                @Override
                public boolean onCaptureVideoFrame(VideoFrame videoFrame) {
                    if (isDestroyed) {
                        return true;
                    }
                    VideoFrame.Buffer buffer = videoFrame.getBuffer();
                    if (!(buffer instanceof VideoFrame.TextureBuffer)) {
                        return true;
                    }

                    VideoFrame.TextureBuffer texBuffer = (VideoFrame.TextureBuffer) buffer;

                    if (mTextureBufferHelper == null) {
                        doOnBeautyCreatingBegin();
                        mTextureBufferHelper = TextureBufferHelper.create("STRender", texBuffer.getEglBaseContext());
                        mTextureBufferHelper.invoke(() -> {
                            iBeautyFaceUnity = IBeautyFaceUnity.create(getContext());
                            return null;
                        });
                        doOnBeautyCreatingEnd();
                    }

                    VideoFrame.TextureBuffer processBuffer;
                    if (isSingleInput) {
                        processBuffer = processSingleInput(texBuffer);
                    } else {
                        processBuffer = processDoubleInput(texBuffer);
                    }
                    if (processBuffer == null) {
                        return true;
                    }
                    // Drop one frame when the rotation changes (e.g. after switching
                    // cameras) so a frame is not rendered with a stale rotation value.
                    if (mFrameRotation != videoFrame.getRotation()) {
                        mFrameRotation = videoFrame.getRotation();
                        return false;
                    }
                    videoFrame.replaceBuffer(processBuffer, mFrameRotation, videoFrame.getTimestampNs());
                    return true;
                }

                @Override
                public boolean onPreEncodeVideoFrame(VideoFrame videoFrame) {
                    return false;
                }

                @Override
                public boolean onScreenCaptureVideoFrame(VideoFrame videoFrame) {
                    return false;
                }

                @Override
                public boolean onPreEncodeScreenVideoFrame(VideoFrame videoFrame) {
                    return false;
                }

                @Override
                public boolean onMediaPlayerVideoFrame(VideoFrame videoFrame, int mediaPlayerId) {
                    return false;
                }

                @Override
                public boolean onRenderVideoFrame(String channelId, int uid, VideoFrame videoFrame) {
                    return false;
                }

                @Override
                public int getVideoFrameProcessMode() {
                    return IVideoFrameObserver.PROCESS_MODE_READ_WRITE;
                }

                @Override
                public int getVideoFormatPreference() {
                    return IVideoFrameObserver.VIDEO_PIXEL_DEFAULT;
                }

                @Override
                public boolean getRotationApplied() {
                    return false;
                }

                @Override
                public boolean getMirrorApplied() {
                    return false;
                }

                @Override
                public int getObservedFramePosition() {
                    return IVideoFrameObserver.POSITION_POST_CAPTURER;
                }
            };
            rtcEngine.registerVideoFrameObserver(mVideoFrameObserver);
            rtcEngine.enableVideo();
            rtcEngine.disableAudio();

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private VideoFrame.TextureBuffer processSingleInput(VideoFrame.TextureBuffer texBuffer) {

        int width = texBuffer.getWidth();
        int height = texBuffer.getHeight();

        Integer processTexId = mTextureBufferHelper.invoke(() -> iBeautyFaceUnity.process(
                texBuffer.getTextureId(),
                width, height
        ));

        return mTextureBufferHelper.wrapTextureBuffer(
                width, height, VideoFrame.TextureBuffer.Type.RGB, processTexId,
                texBuffer.getTransformMatrix());
    }

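    // Dual-input mode: hand the beauty SDK both the OpenGL texture and an NV21 copy of the
    // same frame, then wrap the returned texture id into a new TextureBuffer.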
    private VideoFrame.TextureBuffer processDoubleInput(VideoFrame.TextureBuffer texBuffer) {

        int textureId = texBuffer.getTextureId();
        int width = texBuffer.getWidth();
        int height = texBuffer.getHeight();

        int nv21Size = (int) (width * height * 3.0f / 2.0f + 0.5f);
        if (nv21ByteBuffer == null || nv21ByteBuffer.capacity() != nv21Size) {
            if (nv21ByteBuffer != null) {
                nv21ByteBuffer.clear();
            }
            nv21ByteBuffer = ByteBuffer.allocateDirect(nv21Size);
            nv21ByteArray = new byte[nv21Size];
        }


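        // Convert the texture to I420, then repack it as NV21: passing the V plane where
        // I420ToNV12 expects U (and vice versa) yields VU-interleaved chroma, i.e. NV21.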
        VideoFrame.I420Buffer i420Buffer = texBuffer.toI420();
        YuvHelper.I420ToNV12(i420Buffer.getDataY(), i420Buffer.getStrideY(),
                i420Buffer.getDataV(), i420Buffer.getStrideV(),
                i420Buffer.getDataU(), i420Buffer.getStrideU(),
                nv21ByteBuffer, width, height);
        nv21ByteBuffer.position(0);
        nv21ByteBuffer.get(nv21ByteArray);
        i420Buffer.release();

        Integer processTexId = mTextureBufferHelper.invoke(() -> iBeautyFaceUnity.process(
                nv21ByteArray,
                textureId,
                width, height
        ));

        return mTextureBufferHelper.wrapTextureBuffer(
                width, height, VideoFrame.TextureBuffer.Type.RGB, processTexId, texBuffer.getTransformMatrix());
    }

    private void joinChannel() {
        int uid = new Random(System.currentTimeMillis()).nextInt(1000) + 10000;
        ChannelMediaOptions options = new ChannelMediaOptions();
        options.channelProfile = Constants.CHANNEL_PROFILE_LIVE_BROADCASTING;
        options.clientRoleType = Constants.CLIENT_ROLE_BROADCASTER;
        TokenUtils.gen(requireActivity(), channelId, uid, token -> {
            int ret = rtcEngine.joinChannel(token, channelId, uid, options);
            if (ret != Constants.ERR_OK) {
                showAlert(String.format(Locale.US, "%s\ncode:%d", RtcEngine.getErrorDescription(ret), ret));
            }
        });

        mLocalVideoLayout = new VideoReportLayout(requireContext());
        TextureView videoView = new TextureView(requireContext());
        rtcEngine.setupLocalVideo(new VideoCanvas(videoView, Constants.RENDER_MODE_HIDDEN));
        mLocalVideoLayout.addView(videoView);
        rtcEngine.startPreview();

        updateVideoLayouts(isLocalFull);
    }

    private void updateVideoLayouts(boolean isLocalFull) {
        this.isLocalFull = isLocalFull;
        mBinding.fullVideoContainer.removeAllViews();
        mBinding.smallVideoContainer.removeAllViews();
        if (isLocalFull) {
            if (mLocalVideoLayout != null) {
                mBinding.fullVideoContainer.addView(mLocalVideoLayout);
            }

            if (mRemoteVideoLayout != null) {
                mRemoteVideoLayout.getChildAt(0).setOnClickListener(v -> updateVideoLayouts(!FaceUnityBeauty.this.isLocalFull));
                mBinding.smallVideoContainer.addView(mRemoteVideoLayout);
            }
        } else {
            if (mLocalVideoLayout != null) {
                mLocalVideoLayout.getChildAt(0).setOnClickListener(v -> updateVideoLayouts(!FaceUnityBeauty.this.isLocalFull));
                mBinding.smallVideoContainer.addView(mLocalVideoLayout);
            }
            if (mRemoteVideoLayout != null) {
                mBinding.fullVideoContainer.addView(mRemoteVideoLayout);
            }
        }
    }

    private void doOnBeautyCreatingBegin() {
        Log.d(TAG, "doOnBeautyCreatingBegin...");
    }

    private void doOnBeautyCreatingEnd() {
        Log.d(TAG, "doOnBeautyCreatingEnd.");
        runOnUIThread(() -> {
            mBinding.cbBodyBeauty.setChecked(false);
            mBinding.cbFaceBeautify.setChecked(false);
            mBinding.cbSticker.setChecked(false);
            mBinding.cbMakeup.setChecked(false);
        });
    }

    private void doOnBeautyReleasingBegin() {
        Log.d(TAG, "doOnBeautyReleasingBegin...");
    }

    private void doOnBeautyReleasingEnd() {
        Log.d(TAG, "doOnBeautyReleasingEnd.");
    }
}

[Images 6-7]

Fast to switch

During technology selection, integration testing, and later operations, product and engineering teams regularly find that the current vendor no longer meets business requirements and has to be replaced. That normally means another round of research, reading technical docs, vendor coordination, and price negotiation, all of which is tedious. We have thought of that too: Marketplace plugins are standardized at the wrapper layer, so you can easily switch between vendors of the same type on the same client platform.

Every plugin wraps its interface in the same standardized way, so even after switching to another vendor, calling the plugin is still the same three steps (a sketch of a vendor switch follows this list):

  • addExtension()
  • enableExtension()
  • setExtensionProperty()
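
As a rough sketch of what that means in code: because every plugin is driven through the same calls, the vendor-specific part of your integration shrinks to a handful of identifiers. The names below are placeholders, not any plugin's real provider, extension, or property names.

    // Sketch: vendor-specific details collapse into a small value object; the
    // provider/extension/property names are placeholders (each plugin's docs list the real ones).
    final class BeautyVendor {
        final String provider;      // provider name from the vendor's plugin documentation
        final String extension;     // extension name registered by that vendor
        final String enableKey;     // property key that toggles the beauty effect

        BeautyVendor(String provider, String extension, String enableKey) {
            this.provider = provider;
            this.extension = extension;
            this.enableKey = enableKey;
        }
    }

    // Switching vendors only changes which BeautyVendor you pass in (plus the extension
    // library loaded via addExtension() at engine creation); the engine calls stay the same.
    void enableBeauty(RtcEngine engine, BeautyVendor vendor) {
        engine.enableExtension(vendor.provider, vendor.extension, true);
        engine.setExtensionProperty(vendor.provider, vendor.extension, vendor.enableKey, "true");
    }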

And of course, every plugin comes with its own documentation, with no detail left out.

Fast support

Run into a problem during integration?

No worries, 声网's support team will step in.

We have a dedicated support team to resolve any issues you hit while integrating a plugin. For tricky cases we work jointly with the plugin partner, providing 24/7 support.

Everyone who has tried it agrees.
