The Surprising Effectiveness of Visual Odometry Techniques for Embodied PointGoal Navigation: Code Reproduction

Code repository

https://github.com/Xiaoming-Zhao/PointNav-VO

Environment setup

1. Clone the GitHub repository

git clone https://github.com/Xiaoming-Zhao/PointNav-VO.git
cd PointNav-VO

2. Create the environment
Modify environment.yml as follows:

name: sup_nav
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.7
  - cmake=3.14.0
  - numpy
  - numba
  - tqdm
  - tbb
  - joblib
  - h5py
  - pytorch=1.7.0
  - torchvision=0.8.0
  - cudatoolkit=11.0
  - pip
  - pip:
    - yacs
    - lz4
    - opencv-python
Then create and activate the environment:

conda env create -f environment.yml
conda activate sup_nav
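A quick way to confirm the environment built correctly is to check that PyTorch can see CUDA; this one-liner is my own addition rather than part of the repo's instructions:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"

It should print 1.7.0 and True; if it prints False, the cudatoolkit/driver combination needs another look.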

3. Install habitat-sim

conda install -c aihabitat -c conda-forge habitat-sim=0.1.7 headless
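To double-check that habitat-sim actually landed in the sup_nav environment, a small sanity check of my own (not from the README):

conda list habitat-sim
python -c "import habitat_sim; print('habitat-sim import OK')"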

4. Install habitat-lab

git clone https://github.com/facebookresearch/habitat-lab.git -b v0.1.7
cd habitat-lab
pip install -r requirements.txt
python setup.py develop --all
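Likewise, a quick import check (my own addition) confirms that habitat-lab is visible inside the environment:

python -c "import habitat; print('habitat-lab import OK')"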

5. Set up the dataset and pretrained models
Follow the Dataset section of the original GitHub repository directly.
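For orientation only, the directory tree I ended up with looked roughly like the sketch below; the exact paths are written from memory and may differ, so treat the repo's Dataset section as the authoritative reference:

PointNav-VO/
└── dataset/
    ├── Gibson/                      # Gibson scene assets (*.glb, *.navmesh)
    └── habitat_datasets/
        └── pointnav/
            └── gibson/
                └── v2/              # PointGoal episode data (train / val / val_mini)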

6. Run the eval script
Create run.sh with the following contents:

# cd /path/to/this/repo
export POINTNAV_VO_ROOT=$PWD

export NUMBA_NUM_THREADS=1
export NUMBA_THREADING_LAYER=workqueue
# conda activate sup_nav  # skip this if the environment is already activated
python ${POINTNAV_VO_ROOT}/launch.py \
--repo-path ${POINTNAV_VO_ROOT} \
--n_gpus 1 \
--task-type rl \
--noise 1 \
--run-type eval \
--addr 127.0.1.1 \
--port 8338
Then launch it:

bash run.sh
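If the machine has several GPUs, the run can be pinned to a specific card with the standard CUDA environment variable; this is my own habit rather than something from the repo's instructions:

CUDA_VISIBLE_DEVICES=0 bash run.sh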

[Screenshot: the evaluation running]
It's up and running~
This is the first time I've gotten a navigation codebase to run, so it felt worth recording.
