I'm working on a Python script and I was searching for a method to redirect the stdout and stderr of a subprocess to the logging module. The subprocess is created using the subprocess.call() method. The difficulty I faced is that with subprocess I can redirect stdout and stderr only using a file descriptor. I did not find any other method, but if there is one, please let me know!

To solve this problem I wrote the following code, which basically creates a pipe and uses a thread to read from the pipe and generate log messages using the Python logging module:

```python
import subprocess
import threading
import logging
import os


class LoggerWrapper(threading.Thread):
    """
    Read text messages from a pipe and redirect them
    to a logger (see Python's logging module); the object
    itself is able to supply a file descriptor to be
    used for writing.
    """

    def __init__(self, logger, level):
        """
        Setup the object with a logger and a loglevel
        and start the thread.
        """
        threading.Thread.__init__(self)
        # Make the thread a daemon thread (program will exit when only daemon
        # threads are alive)
        self.daemon = True
        # Set the logger object where messages will be redirected
        self.logger = logger
        # Set the log level
        self.level = level
        # Create the pipe and store read and write file descriptors
        self.fdRead, self.fdWrite = os.pipe()
        # Create a file-like wrapper around the read file descriptor
        # of the pipe; this has been done to simplify read operations
        self.pipeReader = os.fdopen(self.fdRead)
        self.start()

    def fileno(self):
        """
        Return the write file descriptor of the pipe.
        """
        return self.fdWrite

    def run(self):
        """
        This is the method executed by the thread: it simply
        reads from the pipe (using a file-like wrapper) and
        writes the text to the log.
        NB: the trailing newline character of the string
        read from the pipe is removed.
        """
        # Endless loop; the method will exit this loop only
        # when the pipe is closed, that is, when a call to
        # self.pipeReader.close() is made
        while True:
            message = self.pipeReader.readline()
            # If the line read is empty the pipe has been
            # closed, do a cleanup and exit
            # WARNING: I don't know if this method is correct,
            # further study needed
            if len(message) == 0:
                self.pipeReader.close()
                return
            # Remove the trailing newline character from the string
            # before sending it to the logger
            self.logger.log(self.level, message.rstrip('\n'))


logging.basicConfig(filename='command.log', level=logging.INFO)
logWrap = LoggerWrapper(logging, logging.INFO)
# Placeholder command list: the actual command was elided from this copy
subprocess.call(['some-command'], stdout=logWrap, stderr=logWrap)
```

For logging a subprocess's output, Google suggests directly redirecting the output to a file, in a way similar to this:

```python
subprocess.call(['some-command'], stdout=open('logfile.log', 'w'))
```

This is not an option for me, since I need the formatting and loglevel facilities of the logging module. I also suppose that having a file open in write mode by two different entities is not permitted / not a sane thing to do.

I would now like to see your comments and enhancement proposals. I would also like to know if there is already a similar object in the Python library, since I found nothing to accomplish this task!

---

I was having the same problem and this helped me solve it. Your method for doing cleanup, though, is wrong (as you mentioned it might be). Basically, you need to close the write end of the pipe after passing it to the subprocess. That way, when the child process exits and closes its end of the pipe, the logging thread hits end-of-file and readline returns a zero-length message, as you expected. Otherwise, the main process will keep the write end of the pipe open forever, causing readline to block indefinitely, which will cause your thread to live forever, as well as the pipe. This becomes a major problem after a while, because you'll reach the limit on the number of open file descriptors. Note that setting close_fds=True for the subprocess call (which is actually the default) won't help, because that causes the file descriptor to be closed in the forked (child) process before calling exec. We need the file descriptor to be closed in the parent process, i.e. after the subprocess has been started.

Also, the thread shouldn't be a daemon thread, because that creates the risk of losing log data during process shutdown. If you properly clean up as described, all the threads will exit properly, removing the need to mark them as daemons.

Lastly, the while loop can be simplified using a for loop. Implementing all of these changes gives:

```python
import logging
import os
import subprocess
import threading

logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)


class LogPipe(threading.Thread):

    def __init__(self, level):
        """Setup the object with a logger and a loglevel
        and start the thread.
        """
        threading.Thread.__init__(self)
        self.daemon = False
        self.level = level
        self.fdRead, self.fdWrite = os.pipe()
        self.pipeReader = os.fdopen(self.fdRead)
        self.start()

    def fileno(self):
        """Return the write file descriptor of the pipe."""
        return self.fdWrite

    def run(self):
        """Log everything read from the pipe, line by line."""
        for line in iter(self.pipeReader.readline, ''):
            logging.log(self.level, line.strip('\n'))
        self.pipeReader.close()

    def close(self):
        """Close the write end of the pipe."""
        os.close(self.fdWrite)


logpipe = LogPipe(logging.INFO)
# Placeholder command list: the actual command was elided from this copy
with subprocess.Popen(['some-command'], stdout=logpipe, stderr=logpipe) as s:
    logpipe.close()
```

I used different names in a couple of spots, but otherwise it's the same idea, except a little cleaner and more robust.

One remaining caveat: if stdout and stderr go to two separate pipes, the two streams still end up not being synchronized correctly. I'm pretty sure the reason is that we're using two separate threads; the problem is that we're dealing with two different buffers (pipes). I think that if we only used one thread underneath for the logging, the problem would be solved. Having two threads gives an approximate synchronization by writing the data as it becomes available. It's still a race condition, but there are two "servers", so it's normally not a big deal.
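As a complement, here is a minimal sketch (my own addition, not code from the post) of the single-buffer idea: merging stderr into stdout means the child writes to one pipe and one reader consumes it, so the two streams cannot race against each other, at the cost of losing the stdout/stderr distinction and logging everything at a single level. The child command here is just an illustrative stand-in.

```python
import logging
import subprocess
import sys

logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.INFO)

# Merge stderr into stdout so the child writes to a single pipe; a single
# reader then sees lines in the order the child flushed them, with no
# cross-stream race. Trade-off: every line gets the same log level.
cmd = [sys.executable, '-c',
       'import sys; print("to stdout"); print("to stderr", file=sys.stderr)']
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, text=True)
for line in proc.stdout:          # readline loop; ends at EOF
    logging.info(line.rstrip('\n'))
proc.stdout.close()
proc.wait()
```

Reading in the main thread like this also sidesteps the cleanup questions entirely: the loop ends at EOF when the child closes its end, and no daemon-vs-non-daemon decision is needed.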