Plug and play! Saving the training-process log (logger)

Problem description

In a deep-learning train.py we need to record each epoch's loss value in a log file and also print it to the console.

Method

Create a set_logger.py file in the same directory as train.py, with the following code:

import os
import logging

def set_logger1(log_path='../logs/process.log'):
    """Set the logger to log info in terminal and file `log_path`.
    In general, it is useful to have a logger so that every output to the
    terminal is saved in a permanent file.
    Example:
    ```
    logging.info("Starting training...")
    ```
    Args:
        log_path: (string) where to log
    """
    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    if not logger.handlers:
        # Create the log directory if it does not exist yet
        os.makedirs(os.path.dirname(os.path.abspath(log_path)), exist_ok=True)
        # Logging to a file
        file_handler = logging.FileHandler(log_path)
        file_handler.setFormatter(logging.Formatter('%(asctime)s:%(levelname)s:'
                                                    ' %(message)s'))
        logger.addHandler(file_handler)
        # Logging to console
        stream_handler = logging.StreamHandler()
        stream_handler.setFormatter(logging.Formatter('%(message)s'))
        logger.addHandler(stream_handler)
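
As a quick sanity check, you can run set_logger.py directly and confirm that the message appears both in the console and in ../logs/process.log (a minimal sketch appended to the file above, not part of the original recipe):

if __name__ == '__main__':
    set_logger1()
    logging.info("Logger initialized")  # should show up in the console and in ../logs/process.log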

Then add the following code to train.py:

import logging

from set_logger import set_logger1

set_logger1()
# Note: basicConfig has no effect once the root logger already has handlers
# (set_logger1 just added them), so this line is optional.
logging.basicConfig(format='%(asctime)s %(message)s', level=logging.INFO)

# Inside the training loop, log the loss and learning rate once per epoch
logging.info('Epoch [{}/{}], Loss: {:.4f}, lr: {}'.format(
                epoch + 1,
                epoch_number,
                loss.item(),
                optimizer.state_dict()['param_groups'][0]['lr'])
            )
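
For context, here is a minimal, self-contained sketch of where that call sits in a typical PyTorch training loop. The model, loss, optimizer, and data below are dummy placeholders, not part of the original recipe:

import logging

import torch
from torch import nn, optim

from set_logger import set_logger1

set_logger1()

# Dummy model and data, just to make the sketch runnable
model = nn.Linear(4, 1)
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.01)
inputs, targets = torch.randn(32, 4), torch.randn(32, 1)

epoch_number = 10
for epoch in range(epoch_number):
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
    # One log line per epoch, written to both the console and process.log
    logging.info('Epoch [{}/{}], Loss: {:.4f}, lr: {}'.format(
        epoch + 1,
        epoch_number,
        loss.item(),
        optimizer.state_dict()['param_groups'][0]['lr']))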

If logging.info() produces no output when called from some other function (typically because another library has already attached its own handlers to the root logger), add the following right before the logging.info() call that fails to print:

# Remove whatever handlers other code attached to the root logger,
# then re-install the file + console handlers
root_logger = logging.getLogger()
for h in list(root_logger.handlers):
    root_logger.removeHandler(h)
set_logger1()
logging.basicConfig(format='%(asctime)s %(message)s', level=logging.INFO)
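
If you would rather not repeat this reset at every call site, one option (a sketch, not part of the original recipe) is to give the setup function a force flag that clears existing handlers itself:

import os
import logging

def set_logger1(log_path='../logs/process.log', force=False):
    """Configure the root logger; with force=True, replace any existing handlers."""
    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    if force:
        # Drop handlers installed elsewhere so ours take effect
        for h in list(logger.handlers):
            logger.removeHandler(h)

    if not logger.handlers:
        os.makedirs(os.path.dirname(os.path.abspath(log_path)), exist_ok=True)
        file_handler = logging.FileHandler(log_path)
        file_handler.setFormatter(
            logging.Formatter('%(asctime)s:%(levelname)s: %(message)s'))
        logger.addHandler(file_handler)
        stream_handler = logging.StreamHandler()
        stream_handler.setFormatter(logging.Formatter('%(message)s'))
        logger.addHandler(stream_handler)

Calling set_logger1(force=True) inside the affected function then replaces whatever handlers were there. On Python 3.8+, logging.basicConfig(..., force=True) offers a similar built-in reset, though it only configures what you pass to it rather than the file + console pair used here.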
