Eryk Sun <[email protected]> added the comment:
> It's *very* unlikely you'll ever get output that doesn't fit into MBCS,

When writing to a pipe, wmic.exe hard codes the process OEM code page
(i.e. CP_OEMCP). If it matters, run wmic.exe with subprocess using
encoding='oem' instead of text=True.
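
For instance, a minimal sketch of that (the particular WMI query is just
an illustration):

    import subprocess

    # wmic writes its piped output in the process OEM code page, so decode
    # with the 'oem' codec rather than text=True (which would use the ANSI
    # code page).
    p = subprocess.run('wmic os get Caption,Version',
                       capture_output=True, encoding='oem')
    print(p.stdout)
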
That said, wmic.exe is deprecated. I suggest using PowerShell instead. For
example:

    import os
    import json
    import subprocess

    cmd = ('Get-CimInstance Win32_OperatingSystem | '
           'Select Caption, Version | ConvertTo-Json')
    # Decode using the console's output code page, which is what
    # os.device_encoding(1) reports when fd 1 is a console.
    p = subprocess.run(f'powershell.exe -c "{cmd}"', capture_output=True,
                       encoding=os.device_encoding(1))
    result = json.loads(p.stdout)

PowerShell uses the console's output code page (i.e. os.device_encoding(1))
when writing to stdout, even if stdout is a pipe. (If PowerShell is run
without a console via DETACHED_PROCESS, it writes nothing to stdout.) The
only way I know of to make PowerShell write UTF-8 to a stdout pipe is to
temporarily change the console output code page. Assuming the current
process has a console, first get the current code page with
GetConsoleOutputCP(), change it to UTF-8 via SetConsoleOutputCP(CP_UTF8),
run the PowerShell command, and finally restore the original code page.
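
A minimal sketch of that dance with ctypes (the console_output_cp helper
is my own name, not an existing API):

    import json
    import ctypes
    import contextlib
    import subprocess

    kernel32 = ctypes.WinDLL('kernel32', use_last_error=True)
    CP_UTF8 = 65001

    @contextlib.contextmanager
    def console_output_cp(codepage):
        # Switch the console's output code page and restore the original on
        # exit. Assumes the current process is attached to a console.
        old_cp = kernel32.GetConsoleOutputCP()
        if not old_cp or not kernel32.SetConsoleOutputCP(codepage):
            raise ctypes.WinError(ctypes.get_last_error())
        try:
            yield
        finally:
            kernel32.SetConsoleOutputCP(old_cp)

    cmd = ('Get-CimInstance Win32_OperatingSystem | '
           'Select Caption, Version | ConvertTo-Json')
    with console_output_cp(CP_UTF8):
        p = subprocess.run(f'powershell.exe -c "{cmd}"', capture_output=True,
                           encoding='utf-8')
    result = json.loads(p.stdout)
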
Maybe subprocess should provide a context manager to set the console code pages
before a call, and restore the previous console code pages and console modes
after a call completes. That's what CLI shells such as CMD do when running an
external program.
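
For what it's worth, a rough sketch of what such a helper might look like
with ctypes (the saved_console_state name is hypothetical; a more robust
version would open CONIN$/CONOUT$ instead of relying on the standard
handles):

    import ctypes
    import contextlib
    from ctypes import wintypes

    kernel32 = ctypes.WinDLL('kernel32', use_last_error=True)
    kernel32.GetStdHandle.restype = wintypes.HANDLE
    kernel32.GetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.LPDWORD)
    kernel32.SetConsoleMode.argtypes = (wintypes.HANDLE, wintypes.DWORD)
    STD_HANDLES = (-10, -11, -12)  # stdin, stdout, stderr

    @contextlib.contextmanager
    def saved_console_state():
        # Snapshot the console code pages plus the console modes of any
        # standard handles that refer to the console, and restore them
        # after the call completes.
        input_cp = kernel32.GetConsoleCP()
        output_cp = kernel32.GetConsoleOutputCP()
        modes = []
        for n in STD_HANDLES:
            handle = kernel32.GetStdHandle(n)
            mode = wintypes.DWORD()
            if kernel32.GetConsoleMode(handle, ctypes.byref(mode)):
                modes.append((handle, mode.value))
        try:
            yield
        finally:
            kernel32.SetConsoleCP(input_cp)
            kernel32.SetConsoleOutputCP(output_cp)
            for handle, mode in modes:
                kernel32.SetConsoleMode(handle, mode)
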
----------
_______________________________________
Python tracker <[email protected]>
<https://bugs.python.org/issue45382>
_______________________________________