The concurrent-log-handler package seems to do the job perfectly. Tested on Windows; it also supports POSIX systems.
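For reference, the package is published on PyPI under the name concurrent-log-handler, so it can be installed with pip:

pip install concurrent-log-handler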
Main idea
- Create a separate file with a function that returns a logger. The logger must have a fresh instance of ConcurrentRotatingFileHandler for each process. An example function get_logger() is given below.
- Create the loggers at the initialization of each process. For a multiprocessing.Process subclass this means the beginning of the run() method (a minimal sketch with a plain target function follows this list).
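The same idea also works with a plain target function instead of a Process subclass. Below is a minimal sketch under that assumption (the worker function and its messages are just illustrative); the essential point is that get_logger() is called inside the function that actually runs in the child process:

import multiprocessing as mp
import time


def worker():
    # Runs in the child process, so get_logger() builds a fresh
    # ConcurrentRotatingFileHandler for this process.
    from logs import get_logger

    logger = get_logger()
    for _ in range(3):
        logger.info("Hello from a worker process")
        time.sleep(1)


if __name__ == "__main__":
    p = mp.Process(target=worker)
    p.start()
    p.join()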
Detailed instructions
In this example, I will use the following file structure (the entry-point script for the example app can have any name; here I assume app.py):
.
│   app.py
│   child.py
│   logs.py
│   main.py
│   somemodule.py
Code
Child process
# child.py
import multiprocessing as mp
import time

from somemodule import do_something


class ChildProcess(mp.Process):
    def __init__(self):
        self.logger = None
        super().__init__()

    def run(self):
        from logs import get_logger

        self.logger = get_logger()
        while True:
            time.sleep(1)
            self.logger.info("Child process")
            do_something()
- A simple child process that inherits from multiprocessing.Process and simply logs the text "Child process" to file.
- Important: get_logger() is called inside run() (or elsewhere inside the child process), not at module level or in __init__(). This is required because get_logger() creates a ConcurrentRotatingFileHandler instance, and a new instance is needed for each process (a short counter-example sketch follows this list).
- do_something() is used just to demonstrate that this works with 3rd party library code which does not have any clue that you are using concurrent-log-handler.
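For contrast, a hedged sketch of the anti-pattern (the class name is made up): if get_logger() is called in __init__(), it runs in the parent process when the object is constructed, so the handler is created there instead of inside the child process that does the logging:

import multiprocessing as mp


class BrokenChildProcess(mp.Process):
    def __init__(self):
        # Anti-pattern: __init__() executes in the parent process, so the
        # ConcurrentRotatingFileHandler is created in the parent rather
        # than freshly inside the child process.
        from logs import get_logger

        self.logger = get_logger()
        super().__init__()

    def run(self):
        self.logger.info("The handler was not created in this process")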
Main Process
# main.py
import logging
import multiprocessing as mp
import time

from child import ChildProcess
from somemodule import do_something


class MainProcess(mp.Process):
    def __init__(self):
        self.logger = logging.getLogger()
        super().__init__()

    def run(self):
        from logs import get_logger

        self.logger = get_logger()
        self.child = ChildProcess()
        self.child.daemon = True
        self.child.start()
        while True:
            time.sleep(0.5)
            self.logger.critical("Main process")
            do_something()
- The main process, which logs "Main process" to file twice a second. It also inherits from multiprocessing.Process.
- The same comments about get_logger() and do_something() apply as for the child process.
Logger setup
# logs.py
import logging
import os

from concurrent_log_handler import ConcurrentRotatingFileHandler

LOGLEVEL = logging.DEBUG


def get_logger():
    logger = logging.getLogger()

    # Guard: if this process has already configured the root logger,
    # return it as-is instead of adding duplicate handlers.
    if logger.handlers:
        return logger

    logfile = os.path.abspath("mylog.log")
    logger.setLevel(LOGLEVEL)

    filehandler = ConcurrentRotatingFileHandler(
        logfile, mode="a", maxBytes=512 * 1024, backupCount=5, encoding="utf-8"
    )
    filehandler.setLevel(LOGLEVEL)

    # Also log to the console (stderr).
    ch = logging.StreamHandler()
    ch.setLevel(LOGLEVEL)

    formatter = logging.Formatter(
        "%(asctime)s - %(module)s - %(levelname)s - %(message)s [Process: %(process)d, %(filename)s:%(funcName)s(%(lineno)d)]"
    )
    ch.setFormatter(formatter)
    filehandler.setFormatter(formatter)

    logger.addHandler(ch)
    logger.addHandler(filehandler)
    return logger
- This uses the ConcurrentRotatingFileHandler from the concurrent-log-handler package. Each process needs a fresh ConcurrentRotatingFileHandler instance. The if logger.handlers guard makes repeated calls within the same process safe (see the sketch after this list).
- Note that all the arguments for the ConcurrentRotatingFileHandler should be the same in every process.
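A quick sanity check of that guard, assuming the logs.py above and a single process: repeated calls return the same root logger without stacking duplicate handlers.

from logs import get_logger

first = get_logger()
second = get_logger()

assert first is second             # both are the root logger
assert len(first.handlers) == 2    # one StreamHandler + one ConcurrentRotatingFileHandler
first.info("Handlers are not duplicated on repeated get_logger() calls")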
Example app
# app.py (entry point; any filename works)
if __name__ == "__main__":
    from main import MainProcess

    p = MainProcess()
    p.start()
- Just a simple example of how to start the multiprocess application (a variant that blocks until the process exits follows below).
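If you want the entry point to block until the main process stops (for example on Ctrl+C), a hedged variant could look like this:

if __name__ == "__main__":
    from main import MainProcess

    p = MainProcess()
    p.start()
    try:
        # Block until the main process exits; Ctrl+C interrupts the wait.
        p.join()
    except KeyboardInterrupt:
        p.terminate()
        p.join()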
Example of 3rd party module using standard logging
# somemodule.py
import logging

logger = logging.getLogger("somemodule")


def do_something():
    logger.info("doing something")
- Just a simple example to test that loggers from 3rd party code work normally (a short propagation sketch follows).
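The reason this works without touching the third-party code: get_logger() configures the root logger, and named loggers propagate their records to the root handlers by default. A minimal sketch, with a made-up logger name:

import logging

from logs import get_logger

get_logger()  # attach the handlers to the root logger in this process

# Any named logger propagates to the root handlers, so this message ends
# up in mylog.log and on the console without extra configuration.
logging.getLogger("some.third.party.lib").warning("library message")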
Example output
2021-04-19 19:02:29,425 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:29,427 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:29,929 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:29,931 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:30,133 - child - INFO - Child process [Process: 76700, child.py:run(18)]
2021-04-19 19:02:30,137 - somemodule - INFO - doing something [Process: 76700, somemodule.py:do_something(7)]
2021-04-19 19:02:30,436 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:30,439 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:30,944 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:30,946 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:31,142 - child - INFO - Child process [Process: 76700, child.py:run(18)]
2021-04-19 19:02:31,145 - somemodule - INFO - doing something [Process: 76700, somemodule.py:do_something(7)]
2021-04-19 19:02:31,449 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:31,451 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
Comments
- What about multiprocessing.get_logger()? It seems, based on these other ways of doing logging, that the logging functionality built into multiprocessing is of little value. – Tim Ludwinski
- multiprocessing.get_logger() is the logger used by the multiprocessing module itself. It is useful if you want to debug a multiprocessing issue. – jfs
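For completeness, regarding multiprocessing.get_logger() mentioned in the comments above: it returns the logger that the multiprocessing module uses internally, which is separate from the application logging shown in this answer. A hedged sketch of turning it on for debugging:

import logging
import multiprocessing as mp


def work():
    print("hello")


if __name__ == "__main__":
    # Attach a stderr handler to multiprocessing's internal logger; this
    # only shows what multiprocessing itself does (process start/stop, etc.).
    mp.log_to_stderr(logging.DEBUG)

    p = mp.Process(target=work)
    p.start()
    p.join()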