pytorch-lightning pitfall notes

pytorch-lightning==1.5.10, python3.7, torch==1.10.0

20220421

1

module 'torchmetrics' has no attribute 'F1'

Change it to:

torchmetrics.F1Score()

2

You have asked for `amp_level='O2'` but it's only supported with `amp_backend='apex'`

Tried adding `amp_backend: apex` to the Trainer config, which produced:

You have asked for Apex AMP but you have not installed it. Install `apex` using this guide: https://github.com/NVIDIA/apex

Installed it by following this guide: install apex

According to the official docs:

  • amp_level (Optional[str]) – The optimization level to use (O1, O2, etc…). By default it will be set to “O2” if amp_backend is set to “apex”.
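Putting the two options together, the Trainer setup would look like this sketch (assumes apex is installed and a GPU is available; per the docs quoted above, `amp_level="O2"` could even be omitted once `amp_backend="apex"` is set):

```python
from pytorch_lightning import Trainer

# Apex mixed precision; amp_level defaults to "O2" when
# amp_backend is "apex", so passing it is optional.
trainer = Trainer(gpus=1, amp_backend="apex", amp_level="O2")
```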

expected scalar type Float but found Half

But the output dtype is clearly torch.float32!

Stuck here for now; leaving this unresolved.

20220430

My guess is that this GPU cannot run mixed precision; after switching to precision=32, everything runs normally.
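The workaround as a sketch: drop amp entirely and run in full fp32 (assumes a single GPU; `max_epochs` is just a placeholder).

```python
from pytorch_lightning import Trainer

# Full fp32 precision sidesteps the
# "expected scalar type Float but found Half" error on GPUs
# that cannot run mixed precision.
trainer = Trainer(gpus=1, precision=32, max_epochs=10)
```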

20220712

--gpus=    (set this to the number of GPUs to use; to pin specific cards, prefix the command with CUDA_VISIBLE_DEVICES=0,1 to expose cards 0 and 1, then set gpus=2)
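The same selection can be done from Python instead of the shell; a sketch assuming cards 0 and 1 should be used:

```python
import os
from pytorch_lightning import Trainer

# Expose only physical cards 0 and 1 to the process
# (must be set before CUDA is initialized), then tell
# Lightning to use the 2 visible devices.
os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"
trainer = Trainer(gpus=2)
```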
