For a simple Python script, logging is usually unnecessary; print is simpler and more direct. But once the program grows more complex and you need to record how it ran, print falls short. Replacing print with logging captures the execution flow along with timestamps, and writing the log to a file makes it easy to trace problems afterwards.
Basic logging configuration:
# -*- coding: utf-8 -*-
import logging
import os
import time


# Initialize the logger
def init_logger(_log_dir, _logger_name):
    # Create the log directory if needed
    if not os.path.exists(_log_dir):
        os.mkdir(_log_dir)
    _logger = logging.getLogger(_logger_name)
    _logger.setLevel(logging.INFO)
    formatter = logging.Formatter(
        fmt='%(asctime)s %(levelname)-4s %(processName)s %(thread)d %(funcName)s %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S')
    # Write to a file
    file_handler = logging.FileHandler(
        '{}/{}_{}.log'.format(_log_dir, _logger_name, time.strftime("%Y%m%d_%H%M%S", time.localtime(time.time()))),
        mode='w',
        encoding='utf-8')
    file_handler.setLevel(logging.INFO)
    file_handler.setFormatter(formatter)
    _logger.addHandler(file_handler)
    # Write to the console
    console_handler = logging.StreamHandler()
    console_handler.setFormatter(formatter)
    console_handler.setLevel(logging.INFO)
    _logger.addHandler(console_handler)


if __name__ == "__main__":
    # Use the script's file name as the logger name
    log_name = os.path.splitext(os.path.basename(__file__))[0]
    init_logger("logs", log_name)
    logger = logging.getLogger(log_name)
    logger.info("start")
    # do_task()
    logger.info("end")
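A useful property of this pattern: once `init_logger` has run, any other part of the program can retrieve the same configured logger by name, because `logging.getLogger` returns the same object for the same name. A small sketch (the name "demo" is arbitrary):

```python
import logging

# getLogger returns the same logger object for the same name,
# so handlers configured once are visible everywhere
a = logging.getLogger("demo")
b = logging.getLogger("demo")
assert a is b

a.addHandler(logging.StreamHandler())
assert len(b.handlers) == 1  # the handler added via `a` is visible via `b`
```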
multiprocessing logging configuration:
When using multiprocessing, a logger's handlers are not shared across processes, so each process typically has to initialize its own logger. That produces multiple log files and interleaved, out-of-order output. To funnel the logs of all child processes into the main process's log, you need a message queue: the logging module provides QueueListener and QueueHandler for exactly this kind of log aggregation.
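The mechanics are easiest to see in a single process first: a QueueHandler only serializes records into a queue, while a QueueListener runs a background thread that drains the queue and dispatches each record to the real handlers. A minimal sketch using a plain `queue.Queue` (the multiprocessing version swaps in `mp.Queue`):

```python
import logging
import queue
from logging.handlers import QueueHandler, QueueListener

log_queue = queue.Queue()

# Producer side: records are put on the queue, not emitted directly
logger = logging.getLogger("queue_demo")
logger.setLevel(logging.INFO)
logger.addHandler(QueueHandler(log_queue))

# Consumer side: a listener thread drains the queue into real handlers
stream_handler = logging.StreamHandler()
listener = QueueListener(log_queue, stream_handler)
listener.start()

logger.info("hello via queue")
listener.stop()  # flushes remaining records before returning
```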
# -*- coding: utf-8 -*-
import logging
import multiprocessing as mp
import os
import time
from logging.handlers import QueueHandler, QueueListener

# Use the script's file name as the logger name
logger_name = os.path.splitext(os.path.basename(__file__))[0]


# Initialize the logger in the main process
def setup_logger(_log_dir, _logger_name):
    if not os.path.exists(_log_dir):
        os.mkdir(_log_dir)
    log_queue = mp.Queue()
    _logger = logging.getLogger(_logger_name)
    _logger.setLevel(logging.INFO)
    formatter = logging.Formatter(
        fmt='%(asctime)s %(levelname)-4s %(processName)s %(thread)d %(funcName)s %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S')
    # Write to a file
    file_handler = logging.FileHandler(
        '{}/{}_{}.log'.format(_log_dir, _logger_name, time.strftime("%Y%m%d_%H%M%S", time.localtime(time.time()))),
        mode='w',
        encoding='utf-8')
    file_handler.setLevel(logging.INFO)
    file_handler.setFormatter(formatter)
    _logger.addHandler(file_handler)
    # Write to the console
    console_handler = logging.StreamHandler()
    console_handler.setFormatter(formatter)
    console_handler.setLevel(logging.INFO)
    _logger.addHandler(console_handler)
    # The QueueListener drains the queue in the main process
    # and dispatches worker records to the real handlers
    queue_listener = QueueListener(log_queue, file_handler, console_handler)
    queue_listener.start()
    return log_queue, queue_listener


def init_worker(_logger_name, _log_queue):
    # Each child process only gets a QueueHandler,
    # which forwards records to the main process
    queue_handler = QueueHandler(_log_queue)
    _logger = logging.getLogger(_logger_name)
    _logger.setLevel(logging.INFO)
    _logger.addHandler(queue_handler)


def task(n):
    sub_logger = logging.getLogger(logger_name)
    sub_logger.info("n:{}".format(n))
    time.sleep(1)


if __name__ == "__main__":
    log_queue, listener = setup_logger("logs", logger_name)
    logger = logging.getLogger(logger_name)
    logger.info("start")
    pool = mp.Pool(processes=4, initializer=init_worker, initargs=(logger_name, log_queue))
    result = pool.map(task, range(10))
    pool.close()
    pool.join()
    listener.stop()
    logger.info("end")
Running the multiprocessing version produces output like the following:
2025-08-04 13:49:02 INFO MainProcess 18880 <module> start
2025-08-04 13:49:04 INFO SpawnPoolWorker-1 6736 task n:0
2025-08-04 13:49:04 INFO SpawnPoolWorker-2 30224 task n:1
2025-08-04 13:49:04 INFO SpawnPoolWorker-3 30448 task n:2
2025-08-04 13:49:04 INFO SpawnPoolWorker-4 28484 task n:3
2025-08-04 13:49:05 INFO SpawnPoolWorker-2 30224 task n:5
2025-08-04 13:49:05 INFO SpawnPoolWorker-1 6736 task n:4
2025-08-04 13:49:05 INFO SpawnPoolWorker-3 30448 task n:6
2025-08-04 13:49:05 INFO SpawnPoolWorker-4 28484 task n:7
2025-08-04 13:49:06 INFO SpawnPoolWorker-2 30224 task n:8
2025-08-04 13:49:06 INFO SpawnPoolWorker-1 6736 task n:9
2025-08-04 13:49:07 INFO MainProcess 18880 <module> end