webrtc-m79: How Several Delays in the Video-Related VCMTiming Are Updated

1 References

https://blog.csdn.net/lipku/article/details/104124569

https://blog.csdn.net/sonysuqin/article/details/107297157

https://blog.csdn.net/sonysuqin/article/details/106629343

https://www.jianshu.com/p/0bc6a4998b32

Many thanks to the authors of these posts.

2 Code Analysis

2.1 Entry point for the delay updates

void FrameBuffer::StartWaitForNextFrameOnQueue() {
  RTC_DCHECK(callback_queue_);
  RTC_DCHECK(!callback_task_.Running());
  // Find the next decodable frame; this also stamps its render time and
  // returns how long we may keep waiting for it (see 2.2).
  int64_t wait_ms = FindNextFrame(clock_->TimeInMilliseconds());
  callback_task_ = RepeatingTaskHandle::DelayedStart(
      callback_queue_->Get(), TimeDelta::ms(wait_ms), [this] {
        // If this task has not been cancelled, we did not get any new frames
        // while waiting. Continue with frame delivery.
        rtc::CritScope lock(&crit_);
        if (!frames_to_decode_.empty()) {
          // We have frames, deliver! GetNextFrame() is where the jitter and
          // current delays get updated (see 2.3 and 2.4).
          frame_handler_(absl::WrapUnique(GetNextFrame()), kFrameFound);
          CancelCallback();
          return TimeDelta::Zero();  // Ignored.
        } else if (clock_->TimeInMilliseconds() >= latest_return_time_ms_) {
          // We have timed out, signal this and stop repeating.
          frame_handler_(nullptr, kTimeout);
          CancelCallback();
          return TimeDelta::Zero();  // Ignored.
        } else {
          // If there's no frames to decode and there is still time left, it
          // means that the frame buffer was cleared between creation and
          // execution of this task. Continue waiting for the remaining time.
          int64_t wait_ms = FindNextFrame(clock_->TimeInMilliseconds());
          return TimeDelta::ms(wait_ms);
        }
      });
}
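
If RepeatingTaskHandle is unfamiliar, the scheduling pattern above can be pictured with a minimal self-contained sketch. This is a toy, not the WebRTC API: the stop-on-negative-return convention and all names here are made up for illustration.

#include <chrono>
#include <functional>
#include <thread>

// Toy stand-in for RepeatingTaskHandle::DelayedStart: run |task| after a
// delay; |task| returns the next delay in ms, or a negative value to stop.
// The real class posts to a TaskQueue and supports cancellation; this toy
// just sleeps on a detached thread.
void DelayedRepeat(int64_t initial_delay_ms, std::function<int64_t()> task) {
  std::thread([initial_delay_ms, task] {
    int64_t delay_ms = initial_delay_ms;
    while (delay_ms >= 0) {
      std::this_thread::sleep_for(std::chrono::milliseconds(delay_ms));
      delay_ms = task();
    }
  }).detach();
}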

2.2 Updating _renderTimeMs in webrtc::VCMEncodedFrame

// While searching for the next decodable frame, FindNextFrame asks the
// VCMTiming module for the expected render time and stamps it on the frame.

int64_t FrameBuffer::FindNextFrame(int64_t now_ms)
===>
    // By default the render time has not been assigned yet when execution
    // reaches this point.
    if (frame->RenderTime() == -1) {
      // Ask VCMTiming for the expected render time, then store it on the
      // frame for later use.
      frame->SetRenderTime(timing_->RenderTimeMs(frame->Timestamp(), now_ms));
    }
    // Derive the maximum time we may keep waiting for this frame.
    wait_ms = timing_->MaxWaitingTime(frame->RenderTime(), now_ms);
    // Clamp against the caller's deadline: if we are still inside the current
    // scheduling window (not yet timed out), wait_ms is returned.
    wait_ms = std::min(wait_ms, latest_return_time_ms_ - now_ms);
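
To make the arithmetic concrete, here is a small self-contained sketch of the wait computation. The constants standing in for VCMTiming state (decode time, render delay) and the sample numbers are made up for illustration:

#include <algorithm>
#include <cstdint>

// Made-up stand-ins for VCMTiming state.
constexpr int64_t kRequiredDecodeTimeMs = 15;  // cf. RequiredDecodeTimeMs()
constexpr int64_t kRenderDelayMs = 10;         // cf. render_delay_ms_

// Mirrors the idea of VCMTiming::MaxWaitingTime(): a frame may be held until
// its render time, minus the time still needed to decode and render it.
int64_t MaxWaitingTime(int64_t render_time_ms, int64_t now_ms) {
  return render_time_ms - now_ms - kRequiredDecodeTimeMs - kRenderDelayMs;
}

int main() {
  int64_t now_ms = 1000;
  int64_t render_time_ms = 1100;         // from timing_->RenderTimeMs()
  int64_t latest_return_time_ms = 1060;  // the caller's deadline
  int64_t wait_ms = MaxWaitingTime(render_time_ms, now_ms);     // 75 ms
  wait_ms = std::min(wait_ms, latest_return_time_ms - now_ms);  // 60 ms
  return static_cast<int>(wait_ms);
}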

2.3 Updating jitter_delay_ms_ in VCMTiming

EncodedFrame* FrameBuffer::GetNextFrame() {
  int64_t now_ms = clock_->TimeInMilliseconds();
  // TODO(ilnik): remove |frames_out| use frames_to_decode_ directly.
  std::vector<EncodedFrame*> frames_out;

  RTC_DCHECK(!frames_to_decode_.empty());
  bool superframe_delayed_by_retransmission = false;
  size_t superframe_size = 0;
  EncodedFrame* first_frame = frames_to_decode_[0]->second.frame.get();
  int64_t render_time_ms = first_frame->RenderTime();
  int64_t receive_time_ms = first_frame->ReceivedTime();
  // Gracefully handle bad RTP timestamps and render time issues.
  if (HasBadRenderTiming(*first_frame, now_ms)) {
    jitter_estimator_.Reset();
    timing_->Reset();
    render_time_ms = timing_->RenderTimeMs(first_frame->Timestamp(), now_ms);
  }

  for (FrameMap::iterator& frame_it : frames_to_decode_) {
    RTC_DCHECK(frame_it != frames_.end());
    EncodedFrame* frame = frame_it->second.frame.release();

    frame->SetRenderTime(render_time_ms);

    superframe_delayed_by_retransmission |= frame->delayed_by_retransmission();
    receive_time_ms = std::max(receive_time_ms, frame->ReceivedTime());
    superframe_size += frame->size();

    PropagateDecodability(frame_it->second);
    decoded_frames_history_.InsertDecoded(frame_it->first, frame->Timestamp());

    // Remove decoded frame and all undecoded frames before it.
    if (stats_callback_) {
      unsigned int dropped_frames = std::count_if(
          frames_.begin(), frame_it,
          [](const std::pair<const VideoLayerFrameId, FrameInfo>& frame) {
            return frame.second.frame != nullptr;
          });
      if (dropped_frames > 0) {
        stats_callback_->OnDroppedFrames(dropped_frames);
      }
    }

    frames_.erase(frames_.begin(), ++frame_it);

    frames_out.push_back(frame);
  }

  if (!superframe_delayed_by_retransmission) {
    int64_t frame_delay;

    // Compute the observed inter-frame delay.
    if (inter_frame_delay_.CalculateDelay(first_frame->Timestamp(),
                                          &frame_delay, receive_time_ms)) {
      // Feed the observation into the Kalman filter, which outputs the
      // smoothed inter-frame delay estimate, i.e. the jitter.
      jitter_estimator_.UpdateEstimate(frame_delay, superframe_size);
    }

    float rtt_mult = protection_mode_ == kProtectionNackFEC ? 0.0 : 1.0;
    absl::optional<double> rtt_mult_add_cap_ms = absl::nullopt;
    if (rtt_mult_settings_.has_value()) {
      rtt_mult = rtt_mult_settings_->rtt_mult_setting;
      rtt_mult_add_cap_ms = rtt_mult_settings_->rtt_mult_add_cap_ms;
    }
    // Fetch the jitter estimate and store it in timing_. In the initial
    // state, the current delay (googCurrentDelayMs) is set to the jitter.
    timing_->SetJitterDelay(
        jitter_estimator_.GetJitterEstimate(rtt_mult, rtt_mult_add_cap_ms));
    // Update the current delay (googCurrentDelayMs), stepping it toward
    // googTargetDelayMs.
    timing_->UpdateCurrentDelay(render_time_ms, now_ms);
  } else {
    if (RttMultExperiment::RttMultEnabled() || add_rtt_to_playout_delay_)
      jitter_estimator_.FrameNacked();
  }
  // Notify the observers.
  UpdateJitterDelay();
  UpdateTimingFrameInfo();

  if (frames_out.size() == 1) {
    return frames_out[0];
  } else {
    return CombineAndDeleteFrames(frames_out);
  }
}
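
To see what frame_delay actually measures before it enters the Kalman filter, here is a minimal sketch of the inter-frame delay observation (the idea behind VCMInterFrameDelay::CalculateDelay; the struct and its simplified wraparound/reordering handling are mine):

#include <cstdint>

// delay = (change in arrival time) - (change in send time), where send time
// comes from the 90 kHz RTP timestamp. A positive delay means the frame took
// longer than its predecessor to arrive, i.e. observed jitter.
struct InterFrameDelay {
  bool valid = false;
  uint32_t prev_timestamp = 0;  // RTP timestamp of the previous frame
  int64_t prev_receive_ms = 0;  // arrival time of the previous frame

  // Returns false for the very first frame (no previous frame to diff with).
  bool Calculate(uint32_t rtp_timestamp, int64_t receive_ms,
                 int64_t* delay_ms) {
    if (!valid) {
      valid = true;
      prev_timestamp = rtp_timestamp;
      prev_receive_ms = receive_ms;
      return false;
    }
    // Unsigned subtraction handles forward wraparound of the 32-bit RTP
    // timestamp; reordered frames are not handled in this toy version.
    int64_t send_diff_ms =
        static_cast<int64_t>(rtp_timestamp - prev_timestamp) / 90;
    *delay_ms = (receive_ms - prev_receive_ms) - send_diff_ms;
    prev_timestamp = rtp_timestamp;
    prev_receive_ms = receive_ms;
    return true;
  }
};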

2.4 Updating current_delay_ms_ in VCMTiming

This runs once for every decodable frame the FrameBuffer delivers; it updates the current delay, which ultimately feeds the render time computation.

The purpose of current_delay_ms_ in VCMTiming is to converge toward the target delay; the step size of each move is derived from how late the frame actually reached the decoder relative to the schedule implied by its render time.

void FrameBuffer::StartWaitForNextFrameOnQueue()
===>
frame_handler_(absl::WrapUnique(GetNextFrame()), kFrameFound)

EncodedFrame* FrameBuffer::GetNextFrame()
===>
timing_->UpdateCurrentDelay(render_time_ms, now_ms);


void VCMTiming::UpdateCurrentDelay(int64_t render_time_ms,
                                   int64_t actual_decode_time_ms) {
  rtc::CritScope cs(&crit_sect_);
  uint32_t target_delay_ms = TargetDelayInternal();
  // How much later did decoding actually start than the schedule implied by
  // the render time (render time minus decode time minus render delay)?
  int64_t delayed_ms =
      actual_decode_time_ms -
      (render_time_ms - RequiredDecodeTimeMs() - render_delay_ms_);
  if (delayed_ms < 0) {
    return;
  }
  // Step the current delay toward the target, but never beyond it.
  if (current_delay_ms_ + delayed_ms <= target_delay_ms) {
    current_delay_ms_ += delayed_ms;
  } else {
    current_delay_ms_ = target_delay_ms;
  }
}
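
For context, the target that current_delay_ms_ converges to (googTargetDelayMs) is assembled from the jitter, decode, and render delays. A minimal sketch of that composition, mirroring what VCMTiming::TargetDelayInternal computes (parameter names are mine):

#include <algorithm>

// Target delay = jitter delay + decode time + render delay, floored by the
// application-requested minimum playout delay.
int TargetDelayMs(int jitter_delay_ms, int required_decode_time_ms,
                  int render_delay_ms, int min_playout_delay_ms) {
  return std::max(min_playout_delay_ms,
                  jitter_delay_ms + required_decode_time_ms + render_delay_ms);
}
// E.g. 50 ms jitter + 15 ms decode + 10 ms render => a 75 ms target delay.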

2.5 Using ts_extrapolator_ in VCMTiming

Every time the FrameBuffer obtains a decodable frame it must update that frame's render time, which is obtained through the TimestampExtrapolator class. TimestampExtrapolator is a Kalman filter: its input is the RTP timestamp of each incoming frame, from which it computes the frame's expected arrival time, smoothed over time.

Expected render time of a video frame = smoothed frame time (the frame's expected arrival time) + actual delay (derived from current_delay_ms_, min_playout_delay_ms_, and max_playout_delay_ms_).
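For example (numbers purely illustrative): if current_delay_ms_ has grown to 120 ms but the application constrains playout with min_playout_delay_ms_ = 0 and max_playout_delay_ms_ = 100 ms, the actual delay clamps to 100 ms, so a frame whose smoothed arrival time is T is scheduled to render at T + 100 ms.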

int64_t VCMTiming::RenderTimeMs(uint32_t frame_timestamp,
                                int64_t now_ms) const {
  rtc::CritScope cs(&crit_sect_);
  return RenderTimeMsInternal(frame_timestamp, now_ms);
}

int64_t VCMTiming::RenderTimeMsInternal(uint32_t frame_timestamp,
                                        int64_t now_ms) const {
  // If both playout delays are 0, the frame is to be rendered immediately.
  if (min_playout_delay_ms_ == 0 && max_playout_delay_ms_ == 0) {
    // Render as soon as possible.
    return 0;
  }
  // Estimate the smoothed frame time using the Kalman filter.
  int64_t estimated_complete_time_ms =
      ts_extrapolator_->ExtrapolateLocalTime(frame_timestamp);
  if (estimated_complete_time_ms == -1) {
    estimated_complete_time_ms = now_ms;
  }

  // Make sure the actual delay stays in the range of |min_playout_delay_ms_|
  // and |max_playout_delay_ms_|.
  int actual_delay = std::max(current_delay_ms_, min_playout_delay_ms_);
  actual_delay = std::min(actual_delay, max_playout_delay_ms_);
  return estimated_complete_time_ms + actual_delay;
}
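
The real TimestampExtrapolator is a Kalman filter that also tracks clock drift. As a rough intuition for what ExtrapolateLocalTime does, here is a toy model that uses exponential smoothing instead of a Kalman filter; it is entirely illustrative:

#include <cstdint>

// Models local arrival time as (RTP timestamp on a 90 kHz clock) + offset,
// and smooths the offset so per-frame network jitter is filtered out.
class ToyExtrapolator {
 public:
  void Update(uint32_t rtp_timestamp, int64_t receive_ms) {
    double send_ms = rtp_timestamp / 90.0;  // send time on the 90 kHz clock
    double offset = receive_ms - send_ms;   // network delay + clock offset
    if (!initialized_) {
      smoothed_offset_ = offset;
      initialized_ = true;
    } else {
      // Heavy smoothing: jitter on individual frames barely moves the model.
      smoothed_offset_ = 0.95 * smoothed_offset_ + 0.05 * offset;
    }
  }

  // Expected (smoothed) local completion time for a given RTP timestamp,
  // or -1 before the first update, matching the sentinel checked above.
  int64_t ExtrapolateLocalTime(uint32_t rtp_timestamp) const {
    if (!initialized_) return -1;
    return static_cast<int64_t>(rtp_timestamp / 90.0 + smoothed_offset_);
  }

 private:
  bool initialized_ = false;
  double smoothed_offset_ = 0.0;
};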

3 Related Class Diagrams

[Figure 1: class diagram]

[Figure 2: class diagram]

[Figure 3: class diagram]

4 Annotations Added to a Diagram Borrowed from the Links Above

[Figure 4: annotated delay diagram, based on the references above]
