On 6 May 2018 at 05:34, Brett Cannon <br...@python.org> wrote:

>
>
> On Sat, 5 May 2018 at 10:41 Eric Fahlgren <ericfahlg...@gmail.com> wrote:
>
>> On Sat, May 5, 2018 at 10:30 AM, Toshio Kuratomi <a.bad...@gmail.com>
>> wrote:
>>
>>> On Fri, May 4, 2018, 7:00 PM Nathaniel Smith <n...@pobox.com> wrote:
>>>
>>>> What are the obstacles to including "preloaded" objects in regular .pyc
>>>> files, so that everyone can take advantage of this without rebuilding the
>>>> interpreter?
>>>>
>>>
>>> Would this make .pyc files arch specific?
>>>
>>
>> Or have parallel "pyh" (Python "heap") files, that are architecture
>> specific...
>>
>
> .pyc files have tags to specify details about them (e.g. were they
> compiled with -OO), so this isn't an "all or nothing" option, nor does it
> require a different file extension. There just needs to be an appropriate
> finder that knows how to recognize a .pyc file with the appropriate tag
> that can be used, and then a loader that knows how to read that .pyc.
>
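
To make that concrete, here's a minimal sketch of what such a
finder/loader pair could look like. Everything below is invented purely
for illustration: the ".heap" cache tag, the class names, and the
assumption that the file is an ordinary 3.7-style pyc whose code object
we simply unmarshal and exec (rather than anything clever with
preloaded objects):

    import importlib.abc
    import importlib.util
    import marshal
    import os
    import sys

    class HeapPycLoader(importlib.abc.Loader):
        """Load a module from a hypothetical arch-specific cache file."""

        def __init__(self, name, path):
            self.name = name
            self.path = path

        def create_module(self, spec):
            return None  # fall back to the default module creation

        def exec_module(self, module):
            with open(self.path, 'rb') as f:
                f.read(16)  # skip the 3.7 pyc header (magic, flags, source info)
                code = marshal.load(f)
            exec(code, module.__dict__)

    class HeapPycFinder(importlib.abc.MetaPathFinder):
        """Prefer a hypothetical 'spam.cpython-37.heap.pyc' over the normal pyc."""

        def find_spec(self, fullname, path, target=None):
            modname = fullname.rpartition('.')[2]
            for entry in (path if path is not None else sys.path):
                candidate = os.path.join(
                    entry, '__pycache__', modname + '.cpython-37.heap.pyc')
                if os.path.exists(candidate):
                    loader = HeapPycLoader(fullname, candidate)
                    return importlib.util.spec_from_loader(
                        fullname, loader, origin=candidate)
            return None

    # Opting in would then be something like:
    # sys.meta_path.insert(0, HeapPycFinder())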

Right, this is the kind of change I had in mind (perhaps in combination
with Diana Clarke's suggestion from several months back to make pyc
tagging more feature-flag-centric, rather than keeping the current focus
on a numeric optimisation level).
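
For reference, the current scheme only bakes the interpreter tag and a
numeric optimisation level into the cache file name, which is the
limitation the feature-flag idea would address (output below is from a
3.7 Linux build):

    >>> import importlib.util
    >>> importlib.util.cache_from_source('spam.py')
    '__pycache__/spam.cpython-37.pyc'
    >>> importlib.util.cache_from_source('spam.py', optimization=2)
    '__pycache__/spam.cpython-37.opt-2.pyc'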

We also wouldn't ever generate this hypothetical format implicitly -
like the new deterministic pycs in 3.7, it would be something you had
to explicitly request via a compileall invocation. In the Linux distro
use case, the relevant distro packaging helper scripts and macros could
then generate traditional cross-platform pyc files for no-arch packages,
but automatically switch to the load-time optimised arch-specific format
if the package was already arch-specific.
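
Roughly speaking, where distro macros can already do something like the
first call below to opt in to 3.7's hash-based pycs, they'd grow a
second arch-specific variant. The 'arch_specific' keyword is of course
invented - nothing like it exists today:

    import compileall
    import py_compile

    # Real today (3.7): opt in to deterministic, hash-based pycs.
    compileall.compile_dir(
        '/usr/lib/python3.7/site-packages/somepkg',
        invalidation_mode=py_compile.PycInvalidationMode.CHECKED_HASH)

    # Hypothetical: the same opt-in workflow for a load-time optimised,
    # arch-specific format (this keyword does not exist).
    # compileall.compile_dir('/usr/lib64/python3.7/site-packages/otherpkg',
    #                        arch_specific=True)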

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia