0 votes

I have a script that can run on my host machine and on several other servers. I want to launch this script as a background process on my host machine and, via ssh, on the remote machines, with stdout/stderr written to a log file on the host machine for the local background process and on each remote machine for the remote background processes.

I tried with

subprocess.check_output(['python', 'script.py', 'arg_1', '> file.log', '& echo -ne $!'])

but it doesn't work: it gives me neither the PID nor writes to the file. It does work with shell=True, but I've read that shell=True should be avoided for security reasons.

Then I tried

p = subprocess.Popen(['python', 'script.py', 'arg_1', '> file.log'])

Now I can get the process PID, but nothing is written to the log file.

Using the stdout/stderr arguments, as suggested in "append subprocess.Popen output to file?", opens the log file on my host machine, not on the remote machine; I want the log written on the remote machine instead.

Could someone suggest a single call that works on my host machine and also ssh's to the remote server, launches the background process there, and writes to the output file?

<HOW_TO_GET_PID> = subprocess.<WHAT>( ([] if host == 'localhost' else ['ssh', '<remote_server>']) + ['python', 'script.py', 'arg_1' <WHAT>] )

Could someone please complete the above pseudocode?

Thanks,

Have you tried capturing the output using stdout=PIPE, stderr=PIPE then .communicate()? - S3DEV
stackoverflow.com/a/7224186/4772933 — communicate() is blocking, and as I said I want to launch a background task, so I can't use it. - user4772933
>, &, etc. are all shell directives. They're meaningless unless passed to a shell. - Charles Duffy
However, you don't need those directives: you can tell subprocess to do the same thing directly. For example, stdout=open('somefile', 'w') instead of putting >somefile in the command (see the sketch after these comments). - Charles Duffy
BTW, note that requests for "one line" answers typically compromise readability, correctness, or both. Stack Overflow's scope limits it to practical questions; code that isn't readable or correct is not practical to put to real-world mission-critical use. - Charles Duffy
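A minimal sketch of that stdout= approach for the purely local case, using the file and script names from the question:

import subprocess

# Open the log file in the parent and hand it to Popen instead of using a shell ">" redirection
log = open('file.log', 'wb')
p = subprocess.Popen(['python', 'script.py', 'arg_1'],
                     stdout=log, stderr=subprocess.STDOUT)
print('PID:', p.pid)  # Popen returns immediately, so script.py keeps running in the background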

2 Answers

0
votes

You're not going to get something that's safe and correct in a one-liner without making it unreadable; better not to try.

Note that we're using a shell in both cases: in the local case we explicitly pass shell=True, whereas in the remote case ssh always implicitly starts a shell on the remote host.

import shlex
import subprocess

def startBackgroundCommand(argv, outputFile=None, remoteHost=None, andGetPID=False):
    # Quote each word so the command string is safe to hand to a shell
    cmd_str = ' '.join(shlex.quote(word) for word in argv)
    if outputFile is not None:
        cmd_str += ' >%s' % (shlex.quote(outputFile),)
    if andGetPID:
        cmd_str += ' & echo "$!"'  # background the job and print its PID
    if remoteHost is not None:
        # ssh always runs the command string through a shell on the remote host
        p = subprocess.Popen(['ssh', remoteHost, cmd_str], stdout=subprocess.PIPE)
    else:
        # locally we need shell=True so the redirection and '&' are interpreted
        p = subprocess.Popen(cmd_str, stdout=subprocess.PIPE, shell=True)
    return p.communicate()[0]

# Run your command locally
startBackgroundCommand(['python', 'script.py', 'arg_1'],
    outputFile='file.log', andGetPID=True)

# Or run your command remotely
startBackgroundCommand(['python', 'script.py', 'arg_1'],
    remoteHost='foo.example.com', outputFile='file.log', andGetPID=True)
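The function returns whatever the command printed on stdout, so with andGetPID=True you can decode that to get the PID of the background process, for example:

pid_bytes = startBackgroundCommand(['python', 'script.py', 'arg_1'],
    remoteHost='foo.example.com', outputFile='file.log', andGetPID=True)
remote_pid = int(pid_bytes.strip())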
0 votes

# You can even daemonize this launcher itself using os.fork();
# otherwise, run it with something like:
#     nohup python run_my_script.py &
# so that it keeps running even if the SSH connection breaks.
from subprocess import Popen, PIPE, STDOUT

# Merge stderr into stdout so a single pipe captures everything
p = Popen(["python", "yourscript.py"], stdout=PIPE, stderr=STDOUT, stdin=PIPE)
p.stdin.close()
log = open("logfile.log", "wb")
log.write(b"PID: %i\n\n" % p.pid)
while True:
    line = p.stdout.readline()
    if not line:
        break
    log.write(line)
    log.flush()

p.stdout.close()
log.write(b"\nExit status: %i" % p.wait())  # wait() ensures the exit status is available
log.close()
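To cover the remote half of what the question asks for, the same idea can be combined with ssh: build one command string that redirects to a file on the remote machine, backgrounds the job with nohup, and echoes its PID. A minimal sketch; the host name, script, and log path below are placeholders:

import subprocess

remote_cmd = 'nohup python script.py arg_1 > file.log 2>&1 & echo "$!"'
# ssh hands the string to the remote shell, so the redirection and '&' are interpreted there;
# because stdout/stderr go to file.log on the remote side, ssh returns right after the echo
pid_bytes = subprocess.check_output(['ssh', 'remote.example.com', remote_cmd])
print('Remote PID:', int(pid_bytes.strip()))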