The concurrent-log-handler package seems to do the job perfectly. I tested it on Windows; it also supports POSIX systems.
Main idea
- Create a separate file with a function that returns a logger. The logger must get a fresh instance of ConcurrentRotatingFileHandler in each process; an example get_logger() function is given below.
- Create the logger at the initialization of each process. For a multiprocessing.Process subclass, that means the beginning of the run() method. (The same pattern works with a plain function target; see the sketch after this list.)
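If you prefer a plain function target over subclassing, the rule is the same: call get_logger() at the top of the function that runs in the child. Here is a minimal sketch, assuming the logs.py shown below (worker.py is a made-up file name):

# worker.py (hypothetical): the same pattern with a function target
import multiprocessing as mp

def worker():
    # Called inside the child process, so this process builds its own
    # ConcurrentRotatingFileHandler via get_logger().
    from logs import get_logger
    logger = get_logger()
    logger.info("Hello from a worker")

if __name__ == "__main__":
    p = mp.Process(target=worker)
    p.start()
    p.join()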
Detailed instructions
In this example, I will use the following file structure:
.
├── child.py        <-- For the child process
├── logs.py         <-- For setting up the logs for the app
├── main.py         <-- For the main process
├── myapp.py        <-- For starting the app
└── somemodule.py   <-- An example "3rd party module" using standard logging
Code
Child process
# child.py
import multiprocessing as mp
import time

from somemodule import do_something


class ChildProcess(mp.Process):
    def __init__(self):
        self.logger = None
        super().__init__()

    def run(self):
        # Create the logger here, inside the child process, so that this
        # process gets its own ConcurrentRotatingFileHandler instance.
        from logs import get_logger
        self.logger = get_logger()
        while True:
            time.sleep(1)
            self.logger.info("Child process")
            do_something()
- A simple child process that inherits from multiprocessing.Process and logs the text "Child process" to the file once a second.
- Important: get_logger() is called inside run(), or elsewhere inside the child process (not at module level or in __init__()). This is required because get_logger() creates a ConcurrentRotatingFileHandler instance, and a new instance is needed for each process. (A small sanity check of this behaviour is sketched after this list.)
- do_something() is used just to demonstrate that this works with 3rd party library code that has no clue you are using concurrent-log-handler.
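To make the per-process requirement concrete, here is a minimal sketch (not part of the original code) that you could run inside any single process: repeated get_logger() calls return the same configured root logger, because the if logger.handlers guard in logs.py makes the setup idempotent, while each new process that calls it builds its own handlers.

# check_logs.py (hypothetical helper, assumes the logs.py shown below)
from logs import get_logger

def check_single_process_setup():
    a = get_logger()
    b = get_logger()
    assert a is b                 # the root logger is reused within a process
    assert len(a.handlers) == 2   # stream + file handler added exactly once

if __name__ == "__main__":
    check_single_process_setup()
    print("get_logger() is idempotent within this process")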
Main process
# main.py
import logging
import multiprocessing as mp
import time

from child import ChildProcess
from somemodule import do_something


class MainProcess(mp.Process):
    def __init__(self):
        # Placeholder only; the configured logger is created in run(),
        # inside the new process.
        self.logger = logging.getLogger()
        super().__init__()

    def run(self):
        from logs import get_logger
        self.logger = get_logger()
        self.child = ChildProcess()
        self.child.daemon = True
        self.child.start()
        while True:
            time.sleep(0.5)
            self.logger.critical("Main process")
            do_something()
- The main process, which also inherits from multiprocessing.Process, logs "Main process" to the file twice a second.
- The same comments about get_logger() and do_something() apply as for the child process. (The while True loop never exits; a possible shutdown mechanism is sketched after this list.)
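The loops above run forever, which is fine for a demo. For an orderly shutdown you could pass a shared flag into the process. A minimal sketch of that idea; StoppableProcess and stop_event are my assumed names, not part of the original code:

# stoppable.py (hypothetical variant of the same pattern)
import multiprocessing as mp
import time

class StoppableProcess(mp.Process):
    def __init__(self, stop_event):
        self.stop_event = stop_event  # a multiprocessing.Event shared with the parent
        super().__init__()

    def run(self):
        from logs import get_logger
        logger = get_logger()
        while not self.stop_event.is_set():
            time.sleep(0.5)
            logger.info("still running")
        logger.info("stopping cleanly")

if __name__ == "__main__":
    stop = mp.Event()
    p = StoppableProcess(stop)
    p.start()
    time.sleep(3)
    stop.set()  # ask the child to exit its loop
    p.join()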
Logger setup
# logs.py
import logging
import os

from concurrent_log_handler import ConcurrentRotatingFileHandler

LOGLEVEL = logging.DEBUG


def get_logger():
    logger = logging.getLogger()
    if logger.handlers:
        # Already configured in this process; reuse the same logger.
        return logger
    # Use an absolute path to prevent file rotation trouble.
    logfile = os.path.abspath("mylog.log")
    logger.setLevel(LOGLEVEL)
    # Rotate the log after it reaches 512 KiB; keep 5 old copies.
    filehandler = ConcurrentRotatingFileHandler(
        logfile, mode="a", maxBytes=512 * 1024, backupCount=5, encoding="utf-8"
    )
    filehandler.setLevel(LOGLEVEL)
    # Also create a handler for console output (StreamHandler writes to
    # stderr by default).
    ch = logging.StreamHandler()
    ch.setLevel(LOGLEVEL)
    formatter = logging.Formatter(
        "%(asctime)s - %(module)s - %(levelname)s - %(message)s "
        "[Process: %(process)d, %(filename)s:%(funcName)s(%(lineno)d)]"
    )
    # Add the formatter to both handlers
    ch.setFormatter(formatter)
    filehandler.setFormatter(formatter)
    logger.addHandler(ch)
    logger.addHandler(filehandler)
    return logger
- This uses the ConcurrentRotatingFileHandler from the concurrent-log-handler package. Each process needs its own fresh ConcurrentRotatingFileHandler instance.
- Note that all the arguments to ConcurrentRotatingFileHandler should be the same in every process. (One way to keep them in sync is sketched after this list.)
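One way to keep the handler arguments identical across processes is to define them in a single place and reuse them. A minimal sketch of that idea; HANDLER_KWARGS and make_filehandler() are assumed names, not part of the original answer:

# logs.py variant (sketch): a single source of truth for handler arguments
import os
from concurrent_log_handler import ConcurrentRotatingFileHandler

HANDLER_KWARGS = dict(
    filename=os.path.abspath("mylog.log"),
    mode="a",
    maxBytes=512 * 1024,
    backupCount=5,
    encoding="utf-8",
)

def make_filehandler():
    # Every process builds its handler from the same kwargs, so the
    # rotation settings cannot drift between processes.
    return ConcurrentRotatingFileHandler(**HANDLER_KWARGS)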
Example app
# myapp.py 
if __name__ == "__main__":
    from main import MainProcess
    p = MainProcess()
    p.start()
- Just a simple example of how to start the multiprocess application. (A variant that pins the start method is sketched below.)
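Because the default start method differs across platforms ("spawn" on Windows, historically "fork" on most POSIX systems), you may want to pin it explicitly so the app behaves the same everywhere. A hedged variant; multiprocessing.set_start_method() is standard library, but using it here is my addition, not part of the original answer:

# myapp.py variant (sketch): pin the start method and wait for the process
import multiprocessing as mp

if __name__ == "__main__":
    mp.set_start_method("spawn")  # match the Windows default everywhere
    from main import MainProcess
    p = MainProcess()
    p.start()
    p.join()  # keep the launcher alive while the app runs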
Example of a 3rd party module using standard logging
# somemodule.py
import logging

logger = logging.getLogger("somemodule")


def do_something():
    logger.info("doing something")
- Just a simple example to test that loggers from 3rd party code work normally. They do, because records from named loggers propagate up to the root logger, which holds the handlers (see the sketch below).
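For clarity, here is a minimal sketch of the propagation mechanism that makes this work; somelib is a made-up module name:

# somelib.py (hypothetical): typical library-style logging
import logging

logger = logging.getLogger(__name__)  # e.g. "somelib"

def work():
    # No handlers are attached here; the record propagates to the root
    # logger, whose ConcurrentRotatingFileHandler writes it to the file.
    logger.debug("library detail")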
Example output
2021-04-19 19:02:29,425 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:29,427 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:29,929 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:29,931 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:30,133 - child - INFO - Child process [Process: 76700, child.py:run(18)]
2021-04-19 19:02:30,137 - somemodule - INFO - doing something [Process: 76700, somemodule.py:do_something(7)]
2021-04-19 19:02:30,436 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:30,439 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:30,944 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:30,946 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
2021-04-19 19:02:31,142 - child - INFO - Child process [Process: 76700, child.py:run(18)]
2021-04-19 19:02:31,145 - somemodule - INFO - doing something [Process: 76700, somemodule.py:do_something(7)]
2021-04-19 19:02:31,449 - main - CRITICAL - Main process [Process: 103348, main.py:run(23)]
2021-04-19 19:02:31,451 - somemodule - INFO - doing something [Process: 103348, somemodule.py:do_something(7)]
Comments
- What about multiprocessing.get_logger()? It seems, based on these other ways of doing logging, that the logging functionality in multiprocessing is of little value. - Tim Ludwinski
- multiprocessing.get_logger() is the logger used by the multiprocessing module itself. It is useful if you want to debug a multiprocessing issue. - jfs