[Apollo Code Reading: perception] Logic analysis of the lidar tracker module

The tracker in the lidar module

  The tracking part of Apollo's perception lidar module follows a SORT-style algorithm architecture. It mainly consists of the following sub-modules: the association module (association), the common module (common), the measurement module (measurement), the multi-lidar fusion module (multi_lidar_fusion), and the semantic map module (semantic_map).

Main structure diagram of the lidar-tracker code

  In the structure above, MlfTrackData is the data structure that holds a track and inherits from TrackData, while TrackedObject is the data structure of an input measurement; both are allocated from base::ConcurrentObjectPool sequence containers at initialization. You can think of a TrackedObject as one incoming detected object and an MlfTrackData as one object in the track list, with base::ConcurrentObjectPool storing many TrackedObject and MlfTrackData instances.

Note: as described above, the track list maintains the TrackData structure, and each measurement is maintained as a TrackedObject. MlfTrackData inherits from TrackData mainly because lidars come in different brands and types, so a track has to keep per-sensor bookkeeping and confirm which sensor each piece of data came from. See the code for the specifics; this is only a pointer.
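  To make the data layout concrete, here is a simplified, self-contained sketch of the two structures. These are stand-in definitions for illustration only, not the actual Apollo headers; the member names (history, sensor_history) are assumptions, but the per-sensor lookup mirrors the GetLatestSensorObject / GetLatestObject pair that the measurement code uses later.

// Stand-in definitions (NOT the Apollo headers): TrackedObject is one
// measurement, TrackData keeps its history, and MlfTrackData adds per-sensor
// bookkeeping so objects from different lidars can be looked up separately.
#include <map>
#include <memory>
#include <string>
#include <utility>

struct TrackedObject {                                    // one input measurement
  double timestamp = 0.0;
  std::string sensor_name;
};
using TrackedObjectPtr = std::shared_ptr<TrackedObject>;

struct TrackData {                                        // base track structure
  std::map<double, TrackedObjectPtr> history;             // timestamp -> measurement
  std::pair<double, TrackedObjectPtr> GetLatestObject() const {
    if (history.empty()) {
      return {0.0, nullptr};
    }
    return *history.rbegin();                             // newest measurement overall
  }
};

struct MlfTrackData : TrackData {                         // multi-lidar extension
  std::map<std::string, std::map<double, TrackedObjectPtr>> sensor_history;
  std::pair<double, TrackedObjectPtr> GetLatestSensorObject(
      const std::string& sensor_name) const {
    auto it = sensor_history.find(sensor_name);
    if (it == sensor_history.end() || it->second.empty()) {
      return {0.0, nullptr};
    }
    return *it->second.rbegin();                          // newest object from that sensor
  }
};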

  The diagram above outlines the lidar perception tracking code. MlfEngine inherits from the interface class BaseMultiTargetTracker and holds two members, MlfTracker and MlfTrackObjectMatcher. MlfTracker is the state-update module, while MlfTrackObjectMatcher is the data-association module: it builds the association weight as a weighted combination of several distance terms, and the final match is extracted from the association matrix in different ways depending on whether the lidar obstacles are background. The association matrix itself is filled in as follows:

void MlfTrackObjectMatcher::ComputeAssociateMatrix(
    const std::vector<MlfTrackDataPtr> &tracks,
    const std::vector<TrackedObjectPtr> &new_objects,
    common::SecureMat<float> *association_mat) {
  for (size_t i = 0; i < tracks.size(); ++i) {
    for (size_t j = 0; j < new_objects.size(); ++j) {
      // compute the association weight between track i and new object j
      (*association_mat)(i, j) =
          track_object_distance_->ComputeDistance(new_objects[j], tracks[i]);
    }
  }
}

  Next, let's look at how the distance between a lidar measurement (object) and a track is computed; the individual weight/distance functions can be found in distance_collection.h.

float MlfTrackObjectDistance::ComputeDistance(
    const TrackedObjectConstPtr& object,
    const MlfTrackDataConstPtr& track) const {
  // is the measurement a background object?
  bool is_background = object->is_background;
  // latest object held by the track
  const TrackedObjectConstPtr latest_object = track->GetLatestObject().second;
  std::string key = latest_object->sensor_info.name + object->sensor_info.name;
  const std::vector<float>* weights = nullptr;
  if (is_background) {  // background obstacles use their own weight table
    auto iter = background_weight_table_.find(key);
    if (iter == background_weight_table_.end()) {
      weights = &kBackgroundDefaultWeight;  // default background weights
    } else {
      weights = &iter->second;
    }
  } else {
    auto iter = foreground_weight_table_.find(key);
    if (iter == foreground_weight_table_.end()) {
      weights = &kForegroundDefaultWeight;  // default foreground weights
    } else {
      weights = &iter->second;
    }
  }
  if (weights == nullptr || weights->size() < 7) {
    AERROR << "Invalid weights";
    return 1e+10f;
  }
  float distance = 0.f;
  float delta = 1e-10f;

  double current_time = object->object_ptr->latest_tracked_time;
  // predict the track state forward to the measurement time
  track->PredictState(current_time);
  // time gap between the track's last visible time and the new measurement
  double time_diff =
      track->age_ ? current_time - track->latest_visible_time_ : 0;
  if (weights->at(0) > delta) {
    distance += weights->at(0) *
                LocationDistance(latest_object, track->predict_.state, object,
                                 time_diff);  // location distance
  }
  if (weights->at(1) > delta) {
    distance += weights->at(1) *
                DirectionDistance(latest_object, track->predict_.state, object,
                                  time_diff);  // heading/direction distance
  }
  if (weights->at(2) > delta) {
    distance += weights->at(2) *
                BboxSizeDistance(latest_object, track->predict_.state, object,
                                 time_diff);  // bounding-box size distance
  }
  if (weights->at(3) > delta) {
    distance += weights->at(3) *
                PointNumDistance(latest_object, track->predict_.state, object,
                                 time_diff);  // point-count distance
  }
  if (weights->at(4) > delta) {
    distance += weights->at(4) *
                HistogramDistance(latest_object, track->predict_.state, object,
                                  time_diff);  // point histogram distance
  }
  if (weights->at(5) > delta) {
    distance += weights->at(5) *
                CentroidShiftDistance(latest_object, track->predict_.state,
                                      object, time_diff);  // centroid shift distance
  }
  if (weights->at(6) > delta) {
    distance += weights->at(6) *
                BboxIouDistance(latest_object, track->predict_.state, object,
                                time_diff,
                                background_object_match_threshold_);  // bbox IoU distance
  }
  // for foreground, calculate semantic map based distance
  //  if (!is_background) {
  //    distance += weights->at(7) * SemanticMapDistance(*track, object);
  //  }
  return distance;
}
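  As a flavor of what one of those terms in distance_collection.h does, here is an illustrative sketch of a location-style distance. It is not the Apollo implementation (the real LocationDistance is more involved); the function name is hypothetical and only shows the core idea of comparing the track's predicted position against the measured one.

#include <Eigen/Dense>

// Illustrative only: planar offset between the position the track predicts
// for the measurement time and the position actually measured.
float LocationDistanceSketch(const Eigen::Vector3d& predicted_center,
                             const Eigen::Vector3d& measured_center) {
  // (x, y) offset between prediction and measurement
  return static_cast<float>((measured_center - predicted_center).head<2>().norm());
}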

  After the weights between each object and each track's latest object are computed to fill the association matrix, the engine checks whether the obstacles are background: if so, GnnBipartiteGraphMatcher is used; otherwise MultiHmBipartiteGraphMatcher is used to solve for the association result.
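  For background obstacles the GNN matcher is essentially a greedy nearest-neighbour assignment over the association matrix. The snippet below is a self-contained sketch of that greedy idea only; GreedyAssign and max_cost are illustrative names, and the real GnnBipartiteGraphMatcher also reports unassigned tracks and unassigned objects.

#include <utility>
#include <vector>

// Repeatedly pick the globally cheapest (track, object) pair whose cost is
// below max_cost until none remains; each track and object is used at most once.
std::vector<std::pair<int, int>> GreedyAssign(
    const std::vector<std::vector<float>>& cost, float max_cost) {
  std::vector<std::pair<int, int>> assignments;
  std::vector<bool> track_used(cost.size(), false);
  std::vector<bool> object_used(cost.empty() ? 0 : cost[0].size(), false);
  while (true) {
    float best = max_cost;
    int best_i = -1;
    int best_j = -1;
    for (size_t i = 0; i < cost.size(); ++i) {
      if (track_used[i]) continue;
      for (size_t j = 0; j < cost[i].size(); ++j) {
        if (object_used[j] || cost[i][j] >= best) continue;
        best = cost[i][j];
        best_i = static_cast<int>(i);
        best_j = static_cast<int>(j);
      }
    }
    if (best_i < 0) break;                      // no pair below the threshold left
    track_used[best_i] = true;
    object_used[best_j] = true;
    assignments.emplace_back(best_i, best_j);   // (track index, object index)
  }
  return assignments;
}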

lidar-tracker code module: state update

  After the association result is obtained, the measurement objects are used to update the state of the track list. From the structure above there are two filter types, MlfShapeFilter and MlfMotionFilter. The object update covers the basics such as the center point and bounding-box size, and MlfMotionFilter updates the velocity state with a Kalman-style estimation scheme. MlfBaseFilter is the common interface class of the two; Apollo stores the filters in a std::vector so that different state variables can be updated by different filter types, as the MlfTracker initialization code shows:

bool MlfTracker::Init(const MlfTrackerInitOptions options) {
  auto config_manager = lib::ConfigManager::Instance();
  const lib::ModelConfig* model_config = nullptr;
  ACHECK(config_manager->GetModelConfig(Name(), &model_config));
  const std::string work_root = config_manager->work_root();
  std::string config_file;
  std::string root_path;
  ACHECK(model_config->get_value("root_path", &root_path));
  config_file = GetAbsolutePath(work_root, root_path);
  config_file = GetAbsolutePath(config_file, "mlf_tracker.conf");
  MlfTrackerConfig config;
  ACHECK(cyber::common::GetProtoFromFile(config_file, &config));
  // iterate over the filter names in the config file to instantiate each filter
  for (int i = 0; i < config.filter_name_size(); ++i) {
    const auto& name = config.filter_name(i);
    MlfBaseFilter* filter = MlfBaseFilterRegisterer::GetInstanceByName(name);
    ACHECK(filter);
    MlfFilterInitOptions filter_init_options;
    ACHECK(filter->Init(filter_init_options));
    filters_.push_back(filter);  // filters_ stores all registered filter types
    AINFO << "MlfTracker add filter: " << filter->Name();
  }
  return true;
}
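  Once filters_ is populated, updating a track is simply a matter of running every registered filter on the new measurement. The sketch below uses stand-in types to show that pattern; it is not the actual MlfBaseFilter interface, whose real update method takes the filter options, the track data, and the new tracked object.

#include <memory>
#include <vector>

struct TrackUpdateContext {};              // placeholder for track data + new measurement

class BaseFilter {                         // stand-in for the MlfBaseFilter interface
 public:
  virtual ~BaseFilter() = default;
  virtual void UpdateWithObject(TrackUpdateContext* context) = 0;
};

class ShapeFilter : public BaseFilter {    // stand-in for MlfShapeFilter
 public:
  void UpdateWithObject(TrackUpdateContext* context) override {
    // update the shape-related state: center point, bounding-box size, ...
  }
};

class MotionFilter : public BaseFilter {   // stand-in for MlfMotionFilter
 public:
  void UpdateWithObject(TrackUpdateContext* context) override {
    // update the velocity state (Kalman-style in the real implementation)
  }
};

void UpdateTrack(const std::vector<std::unique_ptr<BaseFilter>>& filters,
                 TrackUpdateContext* context) {
  for (const auto& filter : filters) {     // each filter handles its own state variables
    filter->UpdateWithObject(context);
  }
}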
lidar-tracker code module: measurement

  In the measurement module, Apollo computes the following kinds of features; the measurements produced below are mainly used during motion-estimation smoothing.

void MlfMotionMeasurement::ComputeMotionMeasurment(
    const MlfTrackDataConstPtr& track_data, TrackedObjectPtr new_object) {
  // prefer to choose objects from the same sensor
  std::string sensor_name = new_object->sensor_info.name;
  TrackedObjectConstPtr latest_object =
      track_data->GetLatestSensorObject(sensor_name).second;
  if (latest_object == nullptr) {
    latest_object = track_data->GetLatestObject().second;
  }
  if (latest_object.get() == nullptr) {
    AERROR << "latest_object is not available";
    return;
  }
  // should we estimate the measurement if the time diff is too small?
  double latest_time = latest_object->object_ptr->latest_tracked_time;
  double current_time = new_object->object_ptr->latest_tracked_time;
  double time_diff = current_time - latest_time;
  if (fabs(time_diff) < EPSILON_TIME) {
    time_diff = DEFAULT_FPS;
  }
  // velocity measured from the lidar point-cloud anchor point
  MeasureAnchorPointVelocity(new_object, latest_object, time_diff);
  // velocity measured from the bounding-box center
  MeasureBboxCenterVelocity(new_object, latest_object, time_diff);
  // velocity measured from the bounding-box corners
  MeasureBboxCornerVelocity(new_object, latest_object, time_diff);
  // select one measurement based on motion consistency
  MeasurementSelection(track_data, latest_object, new_object);
  // estimate the quality of the selected measurement
  MeasurementQualityEstimation(latest_object, new_object);
}
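  To illustrate what these velocity measurements look like, here is a self-contained sketch of a finite-difference center velocity. It is not the Apollo MeasureBboxCenterVelocity implementation (the function name is hypothetical); it only shows the basic idea of differencing positions over time_diff.

#include <Eigen/Dense>

// Finite-difference velocity from two bounding-box centers over the time gap.
Eigen::Vector3d MeasureCenterVelocitySketch(const Eigen::Vector3d& old_center,
                                            const Eigen::Vector3d& new_center,
                                            double time_diff) {
  if (time_diff <= 0.0) {
    return Eigen::Vector3d::Zero();        // guard against a degenerate time gap
  }
  Eigen::Vector3d velocity = (new_center - old_center) / time_diff;
  velocity(2) = 0.0;                       // keep only the planar component
  return velocity;
}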

  That covers the overall flow and data structures of the Apollo lidar tracker. This article focused on the data structures, the call flow, and a few of the more important module details; the algorithmic details inside each unit will be introduced step by step in later posts.
