Python Subprocess Won't Interleave Stderr And Stdout As What Terminal Does
Solution 1:
In the C libraries (and thus CPython), streams are handled differently depending on whether they are attached to an interactive terminal (or something pretending to be one) or not. For a tty, stdout is line buffered; otherwise it is block buffered and only flushed to the file descriptor when a block boundary is hit. When you redirect to a PIPE, the stream is no longer a tty and block buffering takes effect.
The solution is to reopen stdout, specifying that you want line buffering (buffering=1) regardless. At the C level, stderr is always line buffered, but when I tested reopening only stdout, the program acted as though stderr were block buffered. I was quite surprised. Maybe this is the intermediate io.TextIOWrapper layer or some other odd thing, but I found I needed to fix both pipes.
Even though stdout and stderr go to the same pipe, they are separate file descriptors with separate buffers as far as the executed program is concerned. That's why interleaving doesn't happen naturally in the output even in block-buffered mode.
#!/usr/bin/env python3
import sys
import os
# Reopen stdout line buffered (buffering=1).
sys.stdout = os.fdopen(sys.stdout.fileno(), 'w', 1)
# This surprises me: it seems we have to reopen stderr line
# buffered too, but I thought it was line buffered anyway.
# Perhaps it's the intermediate Python TextIOWrapper layer?
sys.stderr = os.fdopen(sys.stderr.fileno(), 'w', 1)
count = 0
sys.stderr.write('stderr, order %d\n' % count)
count += 1
sys.stdout.write('stdout, order %d\n' % count)
count += 1
sys.stderr.write('stderr, order %d\n' % count)
count += 1
sys.stdout.write('stdout, order %d\n' % count)
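An alternative that avoids modifying the child at all is to launch it with python -u, which makes both of the child's streams unbuffered so every write reaches the shared pipe in program order. A sketch, again with a hypothetical inline child (for non-Python children, stdbuf -oL plays a similar role):

```python
import subprocess
import sys

# Hypothetical child: alternates writes to stderr and stdout.
child = (
    "import sys\n"
    "sys.stderr.write('stderr, order 0\\n')\n"
    "sys.stdout.write('stdout, order 1\\n')\n"
    "sys.stderr.write('stderr, order 2\\n')\n"
    "sys.stdout.write('stdout, order 3\\n')\n"
)

# -u forces unbuffered stdout/stderr in the child, so each write
# hits the merged pipe immediately, preserving program order.
result = subprocess.run(
    [sys.executable, "-u", "-c", child],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    text=True,
)
print(result.stdout.splitlines())
```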