2 votes

I have a Python program that continuously reads the output of another program launched via subprocess.Popen and connected via subprocess.PIPE.

The problem I am facing is that it sometimes loses a significant portion of the output from the launched program.

For example, monitoring for inotify events via a pipe to inotifywait loses many events.

This is the relevant code:


    import select
    import subprocess

    process = subprocess.Popen(["inotifywait", "-q", "-r", "-m",
      "--format", "%e:::::%w%f", srcroot],
      stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    polling = select.poll()
    polling.register(process.stdout)
    process.stdout.flush()

    while True:
        process.stdout.flush()
        # wait up to max_seconds for the pipe to become readable
        if polling.poll(max_seconds*1000):
            line = process.stdout.readline()
            if len(line) > 0:
                print line[:-1]   # strip the trailing newline

Executing the command `inotifywait -q -r -m --format %e:::::%w%f /opt/fileserver/ > /tmp/log1` and moving some files around (to generate inotify events) gives a file with more than 8000 lines. On the other hand, running my script as `./pscript.py > /tmp/log2` gives a file with only about 5000 lines.

try getting the line from stderr as well, and printing that, to check if the lost data is actually there: print process.stderr.read() – Anand S Kumar

Unfortunately the above example was somewhat simplified, as I was already checking for stderr. Thank you anyway. – shodanshok

1 Answer

1 vote

You're ignoring stderr completely in your example: a stderr pipe that is never drained can fill up and block inotifywait, and while it is blocked the kernel inotify queue can overflow and drop events. Try creating the process with stderr merged into stdout instead:

    process = subprocess.Popen(["inotifywait", "-q", "-r", "-m",
      "--format", "%e:::::%w%f", srcroot], stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
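
With stderr merged into stdout there is only one pipe to drain, so a plain blocking read loop is enough to consume every line. Here is a minimal sketch of that approach (reusing srcroot from your question, and dropping the max_seconds poll timeout, which you may still want to keep):

    import subprocess

    process = subprocess.Popen(["inotifywait", "-q", "-r", "-m",
      "--format", "%e:::::%w%f", srcroot],
      stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

    # readline() blocks until a full line is available; the empty string
    # is returned only when inotifywait exits and the pipe is closed
    for line in iter(process.stdout.readline, ''):
        print line.rstrip('\n')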

Furthermore, I'd use inotify directly with one of its Python bindings rather than spawning a process with inotifywait.
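
For example, the pyinotify package (just one of the available bindings; I have not verified it fits your exact setup) lets you watch srcroot recursively with no subprocess and no pipe at all. A rough sketch that mimics your %e:::::%w%f format:

    import pyinotify

    class Handler(pyinotify.ProcessEvent):
        # called for every event without a dedicated process_* method
        def process_default(self, event):
            print event.maskname + ":::::" + event.pathname

    wm = pyinotify.WatchManager()
    # rec=True watches the tree recursively, auto_add=True follows new subdirectories
    wm.add_watch(srcroot, pyinotify.ALL_EVENTS, rec=True, auto_add=True)
    pyinotify.Notifier(wm, Handler()).loop()

Since there is no pipe between two processes, there is no intermediate buffer in which lines can be lost while your reader falls behind.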