Using asyncio for this is a good option I was not aware of.
My best attempt with asyncio was:
import asyncio

async def run_command():
    # Create the subprocess with both streams redirected to pipes
    process = await asyncio.create_subprocess_exec(
        './test.sh',
        stdout=asyncio.subprocess.PIPE,  # Redirect stdout to a pipe
        stderr=asyncio.subprocess.PIPE   # Redirect stderr to a pipe
    )

    # Read stdout and stderr asynchronously, line by line
    captured_output = b''
    async for line in process.stdout:
        print(line.decode().strip())
        captured_output += line
    async for line in process.stderr:
        print(line.decode().strip())
        captured_output += line

    await process.wait()
    print(captured_output)

# Run the asyncio event loop
asyncio.run(run_command())
########################################
This fulfills all my requirements. A nice-to-have would be that
captured_output did not have to be built up with +='s but could be obtained
with a final seek(0) and read() of process.stdout. But I didn't find anything
on how to rewind the stream so that I can read the whole output again.
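One workaround I could imagine (untested sketch, reusing the same ./test.sh
call): process.stdout is a pipe-backed StreamReader, which has no seek(), so
the pipe itself can't be rewound; but the lines can be written into a seekable
io.BytesIO buffer, and the seek(0)/read() done on that instead:

import asyncio
import io

async def run_command():
    process = await asyncio.create_subprocess_exec(
        './test.sh',
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE
    )

    # Collect all output in a seekable in-memory buffer instead of
    # concatenating bytes objects with +=
    buffer = io.BytesIO()
    async for line in process.stdout:
        print(line.decode().strip())
        buffer.write(line)
    async for line in process.stderr:
        print(line.decode().strip())
        buffer.write(line)

    await process.wait()

    buffer.seek(0)        # rewind the buffer ...
    print(buffer.read())  # ... and read the whole captured output again

asyncio.run(run_command())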
Another question is whether this solution is deadlock-proof.
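For context on that question: my understanding is that reading stdout to the
end before starting on stderr can block if the script writes enough to stderr
to fill that pipe's buffer while we are still draining stdout. A variant I
have been looking at (untested sketch; read_stream and chunks are just names I
made up) drains both pipes concurrently with asyncio.gather:

import asyncio

async def read_stream(stream, chunks):
    # Drain one pipe line by line so its buffer can never fill up
    async for line in stream:
        print(line.decode().strip())
        chunks.append(line)

async def run_command():
    process = await asyncio.create_subprocess_exec(
        './test.sh',
        stdout=asyncio.subprocess.PIPE,
        stderr=asyncio.subprocess.PIPE
    )

    chunks = []
    # Read stdout and stderr at the same time instead of one after the other
    await asyncio.gather(
        read_stream(process.stdout, chunks),
        read_stream(process.stderr, chunks),
    )
    await process.wait()

    print(b''.join(chunks))

asyncio.run(run_command())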
Thank you all for the very valuable input so far!
Greetings,
Horst