Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Nick Coghlan
On 27 October 2014 09:44, Paul Moore p.f.mo...@gmail.com wrote:
 I view it as critical (because availability of binaries is *already*
 enough of a problem in the Windows world, without making it worse)
 that we avoid this sort of fragmentation. I'm not seeing an
 acknowledgement from the mingw side that they agree. That's my
 concern. If we both agree, there's nothing to argue about.

I think there's consensus on this front. From Ray: "MinGW-w64 assumes
the very old msvcrt.dll files from Windows XP SP3 and XP64 specifically
to avoid this mess."

That assumption will allow MinGW-w64 to link with the appropriate
MSVCRT versions for extension building without anything breaking.
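
As a rough sketch of what that means in practice (a hypothetical helper,
loosely modelled on distutils.cygwinccompiler.get_msvcr()): the CRT an
extension needs can be derived from the MSC version recorded in the
sys.version string of the target python.org interpreter.

import sys

def guess_msvcr():
    # Pick the msvcr* import library a MinGW-built extension should link
    # against for the interpreter it targets.
    msc_pos = sys.version.find('MSC v.')
    if msc_pos == -1:
        return None  # not an MSVC-built interpreter (e.g. a MinGW build)
    msc_ver = sys.version[msc_pos + 6:msc_pos + 10]
    return {
        '1500': 'msvcr90',   # VS 2008: python.org 2.6/2.7, 3.0-3.2
        '1600': 'msvcr100',  # VS 2010: python.org 3.3/3.4
    }.get(msc_ver)

print(guess_msvcr())  # e.g. 'msvcr100' under a python.org 3.4 build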

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Nick Coghlan
On 27 October 2014 09:37, Paul Moore p.f.mo...@gmail.com wrote:
 On 26 October 2014 23:24, Tony Kelman kel...@berkeley.edu wrote:
 I want, and in many places *need*, an all-MinGW stack.

 OK, I'm willing to accept that statement. But I don't understand it,
 and I don't think you've explained why you *need* your CPython
 interpreter to be compiled with mingw (as opposed to a number of other
 things you might need around building extensions).

I can have a go at an explanation that may make more sense to you.
Consider one of our key requirements for packaging applications for
Fedora: that Fedora builds be *self-hosting*. Given a base Fedora
system, and a source RPM, we need to be able to *build the binary RPM
from source*. (Other Linux distros generally have a similar
requirement)

Relying on opaque binary blobs downloaded from the internet as part of
the build process is not permitted (modulo a few exceptions for
firmware blobs in device drivers).

Now consider that this "automatically rebuild the entire system from
source" model is not unique to Linux - you can use it for any system
where your build process is sufficiently automated, and you have a
need for it. However, the *structure* of that kind of automation
tends to differ wildly between POSIX-style tooling (gcc, clang) and
MSVC. If you have an existing build automation system for *nix
targets, then cross-compilation via MinGW is likely going to be your
smoothest path to adding Windows binary support.

At that point, if CPython is one of your dependencies, you're going to
have the choice of allowing the python.org binaries to be pulled in as
opaque pre-built blobs, or else figuring out how to build an ABI
compatible version with MinGW rather than with MSVC. Think of this
more in the case of *embedding* the CPython runtime in a larger
context (e.g. in Tony's case, to make Python software usable with the
Julia runtime), rather than in building a standalone Python
interpreter for general use.

So, for embedding cases, and for incorporation into POSIX-style build
systems using MinGW-w64 for cross-compilation of Windows binaries, it
may make sense to incorporate the patches that allow building with
MinGW-w64 into mainline CPython (if I recall correctly, we supported
building with Intel's C compiler for a long time, even though we never
shipped anything built with it).

Regards,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
On 27 October 2014 12:30, Nick Coghlan ncogh...@gmail.com wrote:
 OK, I'm willing to accept that statement. But I don't understand it,
 and I don't think you've explained why you *need* your CPython
 interpreter to be compiled with mingw (as opposed to a number of other
 things you might need around building extensions).

 I can have a go at an explanation that may make more sense to you.
 Consider one of our key requirements for packaging applications for
 Fedora: that Fedora builds be *self-hosting*. Given a base Fedora
 system, and a source RPM, we need to be able to *build the binary RPM
 from source*. (Other Linux distros generally have a similar
 requirement)

 Relying on opaque binary blobs downloaded from the internet as part of
 the build process is not permitted (modulo a few exceptions for
 firmware blobs in device drivers).

 Now consider that this "automatically rebuild the entire system from
 source" model is not unique to Linux - you can use it for any system
 where your build process is sufficiently automated, and you have a
 need for it. However, the *structure* of that kind of automation
 tends to differ wildly between POSIX-style tooling (gcc, clang) and
 MSVC. If you have an existing build automation system for *nix
 targets, then cross-compilation via MinGW is likely going to be your
 smoothest path to adding Windows binary support.

 At that point, if CPython is one of your dependencies, you're going to
 have the choice of allowing the python.org binaries to be pulled in as
 opaque pre-built blobs, or else figuring out how to build an ABI
 compatible version with MinGW rather than with MSVC. Think of this
 more in the case of *embedding* the CPython runtime in a larger
 context (e.g. in Tony's case, to make Python software usable with the
 Julia runtime), rather than in building a standalone Python
 interpreter for general use.

 So, for embedding cases, and for incorporation into POSIX-style build
 systems using MinGW-w64 for cross-compilation of Windows binaries, it
 may make sense to incorporate the patches that allow building with
 MinGW-w64 into mainline CPython (if I recall correctly, we supported
 building with Intel's C compiler for a long time, even though we never
 shipped anything built with it).

Thanks Nick. That explanation makes sense to me. I was aware of this
sort of scenario, and as I've said before I don't have any objection
per se to making things easier for people with that sort of
requirement. But some of the other arguments in this thread seemed to
imply more than that. Without specifics, though, I concede that I may
be over-interpreting the rhetoric, so that's the part of the debate
I'm stepping back from, to avoid descending into FUD.

Paul


Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Stefan Richthofer

"Your test program performs no resurrection of x."

Interestingly, it does not change behavior if you write

class X(object):
    def __del__(self):
        X.x = self
        print ref()

(Thanks for making me aware of this! My test-case was already
initially the more complex one given below)

But if the resurrection occurs indirectly, the weakref persists:
(I changed it to an old-style class, because Jython will support new-style
class finalizers only from 2.7 beta 4 onwards, i.e. the test would be
pointless with any current release)

import weakref, time, gc

class ReferentDummy():
    pass

class X():
    def __del__(self):
        X.y = self.z
        print "__del__: " + str(ref())

x = X()
x2 = ReferentDummy()
ref = weakref.ref(x2)
x.z = x2
del x2
del x            # Everything is now deleted, isn't it?
gc.collect()     # needed in the Jython case
time.sleep(0.2)  # wait for Java's async gc to finish
print ref()
print weakref.getweakrefs(X.y)


---CPython output:
__del__: <__main__.ReferentDummy instance at 0x7fd2603e1950>
<__main__.ReferentDummy instance at 0x7fd2603e1950>
[<weakref at 0x7fd2603d2c00; to 'instance' at 0x7fd2603e1950>]

---Jython 2.7 beta 3 output:
__del__: None
None
[]

One can surely argue x2 has never been dead, or see it as it was killed
along with x and then resurrected by x. Jython clearly takes the second
point of view and also clears weakrefs to x.z, while CPython does not.
Yes, these details probably hardly matter in practice (however, they
could cause subtle bugs when porting complex stuff from CPython to
Jython), but since I try to bridge it, I have to look into this more
carefully.

Best,

Stefan



On 10/26/2014 06:44 PM, Armin Rigo wrote:

Hi Stefan,

On 26 October 2014 02:50, Stefan Richthofer stefan.richtho...@gmx.de wrote:

It appears weakrefs are only cleared if this is done by gc (where no
resurrection can happen anyway). If a resurrection-performing-__del__ is
just called by ref-count-drop-to-0, weakrefs persist -

How do you reach this conclusion?  The following test program seems to
show the opposite, by printing None on Python 2.7.6:

import weakref

class X(object):
    def __del__(self):
        print ref()

x = X()
ref = weakref.ref(x)
del x


A bientôt,

Armin.




Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Antoine Pitrou
On Mon, 27 Oct 2014 14:36:31 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:
 Your test program performs no resurrection of x.
 
 Interestingly, it does not change behavior if you write
 
 class X(object):
     def __del__(self):
         X.x = self
         print ref()
 
 (Thanks for making me aware of this! My test-case was already
 initially the more complex one given below)
 
 But if the resurrection occurs indirectly, the weakref persists:

It's not that resurrection occurs indirectly, it's that the object
pointed to by x2 always remains alive (first as an instance attribute
of x, second as a class attribute of X *before x is deleted*).

Regards

Antoine.




Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
On 26 October 2014 01:05, Ray Donnelly mingw.andr...@gmail.com wrote:
 Download and run:
 http://sourceforge.net/projects/msys2/files/Base/x86_64/msys2-x86_64-20141003.exe/download

Sending this offline because I really don't want to start up another
extended debate, but is there a version of this that I can use that


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
Please ignore this. I hit the wrong button.

On 27 October 2014 14:18, Paul Moore p.f.mo...@gmail.com wrote:
 On 26 October 2014 01:05, Ray Donnelly mingw.andr...@gmail.com wrote:
 Download and run:
 http://sourceforge.net/projects/msys2/files/Base/x86_64/msys2-x86_64-20141003.exe/download

 Sending this offline because I really don't want to start up another
 extended debate, but is there a version of this that I can use that


Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Stefan Richthofer

"It's not that resurrection occurs indirectly, it's that the object
pointed to by x2 always remains alive"

Yes, this is right for CPython. More precisely, this is about the definition
of the word "resurrection" (in a language-independent sense),
which does not seem to be unique.

I already pointed out: "One can surely argue x2 has never been dead, or
see it as it was killed along with x and then resurrected by x."

In Java and thus in Jython, it is treated as the second one. An equivalent
program written in Java or Jython would even call the finalizer of x2 (if
it had one) and clear its weakrefs before it is available again as a class
attribute of X.
So there actually *is* a notion that refers to this scenario as resurrection.
I admit it is arguable and maybe misleading in the CPython case, and I was
not aware of the whole behavior when I called the topic "resurrection".

What would still be interesting (at least when Jython 3 is born) is which
of the mentioned behaviors occurs if the cleanup is performed by CPython's
cyclic gc (consistently the first one, I would guess). As you pointed out,
this is only relevant from 3.4 on, since 2.x etc. does not call finalizers
in cycles. (Since I mainly work on Jython or Python 2.7, I currently have
no 3.4 installed to test this right away. I will test it someday...)



Best,

Stefan



On 10/27/2014 03:14 PM, Antoine Pitrou wrote:

On Mon, 27 Oct 2014 14:36:31 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:

Your test program performs no resurrection of x.

Interestingly, it does not change behavior if you write

class X(object):
    def __del__(self):
        X.x = self
        print ref()

(Thanks for making me aware of this! My test-case was already
initially the more complex one given below)

But if the resurrection occurs indirectly, the weakref persists:

It's not that resurrection occurs indirectly, it's that the object
pointed to by x2 always remains alive (first as an instance attribute
of x, second as a class attribute of X *before x is deleted*).

Regards

Antoine.






Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Antoine Pitrou
On Mon, 27 Oct 2014 16:20:06 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:
 
 I already pointed out
 One can surely argue x2 has never been dead, or see it as
 it was killed along with x and then resurrected by x.
 
 In Java and thus in Jython, it is treated as the second one.

You mean Jython deletes instance attributes before calling __del__ ?
That would make most __del__ implementations quite useless...
And actually your own example would fail with an AttributeError on
X.y = self.z.

 What would still be interesting (at least when Jython 3 is born),
 is which of the mentioned behaviors occurs if it is
 performed by CPython's cyclic gc (consistently the first one I would guess).

In which use case exactly? :-) I've lost track a bit, since you've
posted several examples...

Regards

Antoine.


Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Stefan Richthofer



"You mean Jython deletes instance attributes before calling __del__?"


No. I think the term "object resurrection" usually does not mean bringing
back a deleted object in the sense that its memory was already freed.
I think it rather means that nothing referred to an object, so it was on
the kill-list of the gc or of the zero-ref-count macro.
During its finalizer call, the object managed to get some reference
again. The gc or zero-ref-count macro checks the refcount again after the
finalizer call (doesn't it?) and then refrains from the originally
triggered deletion task.
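
For illustration, a rough sketch of the direct case (a hypothetical toy
example, not one of my real test programs), run under CPython 2.7 like the
examples above:

import weakref

class X(object):
    def __del__(self):
        # direct resurrection: during finalization, self gains a new
        # reference from a live container (the class X itself)
        X.survivor = self

x = X()
x_id = id(x)
ref = weakref.ref(x)
del x                          # refcount drops to zero, finalization runs

print(ref())                   # None on CPython 2.7: the weakref was cleared
                               # even though the object survived
print(id(X.survivor) == x_id)  # True: the memory was never freed, so the
                               # identity is unchanged

So the weakref is already gone although the object keeps its identity.
Whether 3.4's new finalization (PEP 442) changes this ordering is something
I have not tested yet.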

Where things get weird is how to treat objects (e.g. x2) that are
reachable only via the original object (e.g. x).

x becomes unreachable => x2 is unreachable too

CPython behavior:
free x's weakrefs, call x.__del__ => x2 is reachable again => free
memory of x; don't touch x2 at all

Java/Jython behavior:
free all weakrefs, call all finalizers of unreachable objects, i.e. call
x.__del__, call x2.__del__ (and maybe more)
=> x2 is reachable again => free memory of x; let x2 survive
(x2 even survives at least for another gc cycle if the finalizer of x or
x2 only created a weak ref)

At least in the Java/Jython case I would call x2 resurrected, i.e.
its finalizer was called and its weakrefs were cleared. It was on the
death-list and escaped from it. This finally brings the definition of
the word "resurrection" to its limit in a language-independent sense, as
one can argue there was no resurrection of x2 in CPython although it's
one and the same scenario.


"In which use case exactly?"

The one with indirect resurrection.
Would it have CPython behavior as sketched above or Java/Jython
behavior? (I confirmed the sketched behavior only for ref-drop-to-zero
triggered cleanup)


Best,

Stefan


On 10/27/2014 04:31 PM, Antoine Pitrou wrote:

On Mon, 27 Oct 2014 16:20:06 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:

I already pointed out
One can surely argue x2 has never been dead, or see it as
it was killed along with x and then resurrected by x.

In Java and thus in Jython, it is treated as the second one.

You mean Jython deletes instance attributes before calling __del__ ?
That would make most __del__ implementations quite useless...
And actually your own example would fail with an AttributeError on
X.y = self.z.


What would still be interesting (at least when Jython 3 is born),
is which of the mentioned behaviors occurs if it is
performed by CPython's cyclic gc (consistently the first one I would guess).

In which use case exactly? :-) I've lost track a bit, since you've
posted several examples...

Regards

Antoine.




Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Antoine Pitrou
On Mon, 27 Oct 2014 17:23:23 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:
 
 You mean Jython deletes instance attributes before calling __del__ ?
 
 No. I think the term of object resurrection usually does not mean bringing
 back a deleted object in the sense that memory was already freed.
 I think it rather means that nothing referred to an object, so it was on the
 kill-list of gc or zero-ref-count macro.

x2 does *not* have its refcount drop to zero, since it is still
referenced by x. In other words, x2 can only be on a kill list
after x has been finalized, which can only be *after* __del__ was
executed.

If Jython does things differently, then certainly its behaviour is
incompatible with the common expectations of Python developers.

Regards

Antoine.




Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Stefan Richthofer

I already admitted that it is implementation specific whether one would
talk of resurrection, even in one and the same scenario. (Although
I would prefer to agree on an abstract notion of the resurrection term.)


"If Jython does things differently, then certainly its behaviour is
incompatible with the common expectations of Python developers."


Guido recently pointed out that it is allowed for different Python
implementations to alter details of gc behavior. (And I suppose this
was more a reminder of already common consensus.)

However I agree that some aspects could be improved and I am looking
at it. So far I have all answers I needed. Thanks for the discussion!


-Stefan





On 10/27/2014 05:36 PM, Antoine Pitrou wrote:

On Mon, 27 Oct 2014 17:23:23 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:

You mean Jython deletes instance attributes before calling __del__ ?

No. I think the term of object resurrection usually does not mean bringing
back a deleted object in the sense that memory was already freed.
I think it rather means that nothing referred to an object, so it was on the
kill-list of gc or zero-ref-count macro.

x2 does *not* have its refcount drop to zero, since it is still
referenced by x. In other words, x2 can only be on a kill list
after x has been finalized, which can only be *after* __del__ was
executed.

If Jython does things differently, then certainly its behaviour is
incompatible with the common expectations of Python developers.

Regards

Antoine.






Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
On 26 October 2014 23:44, Paul Moore p.f.mo...@gmail.com wrote:
 On 26 October 2014 23:11, Ray Donnelly mingw.andr...@gmail.com wrote:
 I don't know where this ABI compatible thing came into being;

 Simple. If a mingw-built CPython doesn't work with the same extensions
 as a MSVC-built CPython, then the community gets fragmented (because
 you can only use the extensions built for your stack). Assuming numpy
 needs mingw and ultimately only gets built for a mingw-compiled Python
 (because the issues building for MSVC-built Python are too hard) and
 assuming that nobody wants to make the effort to build pywin32 under
 mingw, then what does someone who needs both numpy and pywin32 do?

 Avoiding that issue is what I mean by ABI-compatible. (And that's all
 I mean by it, nothing more subtle or controversial).

 I view it as critical (because availability of binaries is *already*
 enough of a problem in the Windows world, without making it worse)
 that we avoid this sort of fragmentation. I'm not seeing an
 acknowledgement from the mingw side that they agree. That's my
 concern. If we both agree, there's nothing to argue about.

I have just done some experiments with building CPython extensions
with mingw-w64. Thanks to Ray for helping me set this up.

The bad news is that the support added to the old 32-bit mingw to
support linking to alternative C runtime libraries (specifically
-lmsvcr100) has bitrotted, and no longer functions correctly in
mingw-w64. As a result, not only can mingw-w64 not build extensions
that are compatible with python.org Python, it can't build extensions
that function at all [1]. They link incompatibly to *both* msvcrt and
msvcr100.

This is a bug in mingw-w64. I have reported it to Ray, who's passed it
on to one of the mingw-w64 developers. But as things stand, mingw
builds will definitely produce binary extensions that aren't
compatible with python.org Python.

Paul

[1] Note, that's if you just use --compiler=mingw32 as supported by
distutils. Looking at how the numpy folks build, they seem to hack
their own version of the distutils C compiler classes. I don't know
whether that's just to work around this bug, or whether they do it for
other reasons as well (but I suspect the latter).


Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Antoine Pitrou
On Mon, 27 Oct 2014 18:40:24 +0100
Stefan Richthofer stefan.richtho...@gmx.de wrote:
 If Jython does things differently, then certainly its behaviour is
 incompatible with the common expectations of Python developers.
 
 Guido recently pointed out that it is allowed for different Python
 implementations to alter details of gc behavior.

I'm afraid you misunderstood this whole sub-branch of the discussion.

Regards

Antoine.


Re: [Python-Dev] XP buildbot problem cloning from hg.python.org

2014-10-27 Thread David Bolen
Ned Deily n...@acm.org writes:

 Update: after consulting with Donald on IRC, it appears that the problem 
 was on the python.org end and is now fixed.  David, is it now working 
 again for you?

Sorry for the delay - yes, it appears to be working again for me as
well.  And it looks like clones during the buildbot tests were working
again as of tests yesterday.

-- David



Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Case Van Horsen
On Mon, Oct 27, 2014 at 10:48 AM, Paul Moore p.f.mo...@gmail.com wrote:
 The bad news is that the support added to the old 32-bit mingw to
 support linking to alternative C runtime libraries (specifically
 -lmsvcr100) has bitrotted, and no longer functions correctly in
 mingw-w64. As a result, not only can mingw-w64 not build extensions
 that are compatible with python.org Python, it can't build extensions
 that function at all [1]. They link incompatibly to *both* msvcrt and
 msvcr100.

 This is a bug in mingw-w64. I have reported it to Ray, who's passed it
 onto one of the mingw-w64 developers. But as things stand, mingw
 builds will definitely produce binary extensions that aren't
 compatible with python.org Python.

 Paul

 [1] Note, that's if you just use --compiler=mingw32 as supported by
 distutils. Looking at how the numpy folks build, they seem to hack
 their own version of the distutils C compiler classes. I don't know
 whether that's just to work around this bug, or whether they do it for
 other reasons as well (but I suspect the latter).

I've managed to build gmpy2 (which requires GMP, MPFR, and MPC
libraries) using msys2. I've detailed the steps (hacking) at:

https://code.google.com/p/gmpy/source/browse/trunk/msys2_build.txt

One of the hacks I made addresses the linking bug. The extension
does run with both the 32-bit and 64-bit versions of CPython 2.7,
3.2, 3.3, and 3.4.

It is possible, just not easy. Anything that makes it easier would
be very helpful.

casevh


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
On 27 October 2014 18:47, Case Van Horsen cas...@gmail.com wrote:
 I've managed to build gmpy2 (which requires GMP, MPFR, and MPC
 libraries) using msys2. I've detailed the steps (hacking) at:

 https://code.google.com/p/gmpy/source/browse/trunk/msys2_build.txt

Thanks for this. I don't have the time to read your notes right now,
but I will do so.

 One of the hacks I made addresses the linking bug. The extension
 does run with the both the 32-bit and 64-bit versions of CPython 2.7,
 3.2, 3.3, and 3.4.

Did you report the linking bug to the mingw-w64 project? The key
thing here is that without "gcc -lmsvcr100 foo.c" working (i.e., not
resulting in linking with msvcrt), building Python extensions will
always need hacks to work around that bug.

 It is possible, just not easy. Anything that makes is easier would
 be very helpful.

With the bug fixed, the steps should be as trivial as:

1. Use python.org Python, with gcc on your PATH.
2. Install any dependencies (e.g., gmp) where gcc can see them.
3. python setup.py build_ext --compiler=mingw32 bdist_wheel

(or whatever setup.py invocation suits you, as long as you set
compiler=mingw32).
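
For reference, step 3 assumes a setup.py along these lines (a hypothetical
minimal example; 'demo' and demo.c are placeholders, and bdist_wheel
additionally needs the wheel package installed):

# setup.py for a hypothetical minimal C extension
from setuptools import setup, Extension

setup(
    name='demo',
    version='0.1',
    ext_modules=[Extension('demo', sources=['demo.c'])],
)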

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Ray Donnelly
On Sun, Oct 26, 2014 at 11:52 PM,  mar...@v.loewis.de wrote:

 Zitat von Tony Kelman kel...@berkeley.edu:

 A maintainer has volunteered. Others will help. Can any core developers
 please begin reviewing some of his patches?


 Unfortunately, every attempt to review these patches has failed for me,
 every time. In the last iteration of an attempt to add mingw64 support,
 I had asked contributors to also provide instructions on how to use these
 patches, and haven't received any instructions that actually worked.

 I'm hesitant to add code that I cannot verify as actually working.

 I guess anybody else reviewing these patches ran into similar problems
 (I know some other core developers have tried reviewing them as well,
 others have stated here that they are unable to review the patches).


https://mail.python.org/pipermail/python-dev/2014-October/136756.html

 Regards,
 Martin





Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Nathaniel Smith
On Mon, Oct 27, 2014 at 5:48 PM, Paul Moore p.f.mo...@gmail.com wrote:
 On 26 October 2014 23:44, Paul Moore p.f.mo...@gmail.com wrote:
 On 26 October 2014 23:11, Ray Donnelly mingw.andr...@gmail.com wrote:
 I don't know where this ABI compatible thing came into being;

 Simple. If a mingw-built CPython doesn't work with the same extensions
 as a MSVC-built CPython, then the community gets fragmented (because
 you can only use the extensions built for your stack). Assuming numpy
 needs mingw and ultimately only gets built for a mingw-compiled Python
 (because the issues building for MSVC-built Python are too hard) and
 assuming that nobody wants to make the effort to build pywin32 under
 mingw, then what does someone who needs both numpy and pywin32 do?

 Avoiding that issue is what I mean by ABI-compatible. (And that's all
 I mean by it, nothing more subtle or controversial).

 I view it as critical (because availability of binaries is *already*
 enough of a problem in the Windows world, without making it worse)
 that we avoid this sort of fragmentation. I'm not seeing an
 acknowledgement from the mingw side that they agree. That's my
 concern. If we both agree, there's nothing to argue about.

 I have just done some experiments with building CPython extensions
 with mingw-w64. Thanks to Ray for helping me set this up.

 The bad news is that the support added to the old 32-bit mingw to
 support linking to alternative C runtime libraries (specifically
 -lmsvcr100) has bitrotted, and no longer functions correctly in
 mingw-w64. As a result, not only can mingw-w64 not build extensions
 that are compatible with python.org Python, it can't build extensions
 that function at all [1]. They link incompatibly to *both* msvcrt and
 msvcr100.

 This is a bug in mingw-w64. I have reported it to Ray, who's passed it
 onto one of the mingw-w64 developers. But as things stand, mingw
 builds will definitely produce binary extensions that aren't
 compatible with python.org Python.

IIUC, getting mingw-w64 to link against msvcr100 instead of msvcrt
requires a custom mingw-w64 build, because by default mingw-w64's
internal runtime libraries (libgcc etc.) are linked against msvcrt. So
by the time you're choosing compiler switches etc., it's already too
late -- your switches might affect how *your* code is built, but your
code will still be linked against pre-existing runtime libraries that
are linked against msvcrt.

It's possible to hack the mingw-w64 build process to build the runtime
libraries against msvcr100 (or whatever) instead of msvcrt, but this
is still not a panacea -- the different msvcr* libraries are, of
course, incompatible with each other, and IIUC the mingw-w64
developers have never tried to make their libraries work against
anything except msvcrt. For example, mingw-w64's gfortran runtime uses
a symbol that's only available in msvcrt, not msvcr90 or msvcr100:
  http://sourceforge.net/p/mingw-w64/mailman/message/31768118/

So my impression is that these issues are all fixable, but they will
require real engagement with mingw-w64 upstream.

 [1] Note, that's if you just use --compiler=mingw32 as supported by
 distutils. Looking at how the numpy folks build, they seem to hack
 their own version of the distutils C compiler classes. I don't know
 whether that's just to work around this bug, or whether they do it for
 other reasons as well (but I suspect the latter).

numpy.distutils is a massive pile of hacks to handle all kinds of
weird things including recursive builds, fortran, runtime capability
detection (like autoconf), and every random issue anyone ran into at
some point in the last 10 years and couldn't be bothered filing a
proper upstream bug report. Basically no-one knows what it actually
does -- the source is your only hope :-).

-n

-- 
Nathaniel J. Smith
Postdoctoral researcher - Informatics - University of Edinburgh
http://vorpus.org


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Greg Ewing

Nick Coghlan wrote:

That assumption will allow MinGW-w64 to link with the appropriate
MSVCRT versions for extension building without anything breaking.


If that works, then the same technique should allow CPython
itself to be built in a VS-compatible way with mingw,
shouldn't it?

Those objecting to a mingw-built python seem to be assuming
that such a thing will necessarily be incompatible with
VS builds, but I don't see why that has to be the case.

--
Greg


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
On 27 October 2014 20:45, Greg Ewing greg.ew...@canterbury.ac.nz wrote:
 Nick Coghlan wrote:

 That assumption will allow MinGW-w64 to link with the appropriate
 MSVCRT versions for extension building without anything breaking.


 If that works, then the same technique should allow CPython
 itself to be built in a VS-compatible way with mingw,
 shouldn't it?

Yes.

 Those objecting to a mingw-built python seem to be assuming
 that such a thing will necessarily be incompatible with
 VS builds, but I don't see why that has to be the case.

No, we've been trying to establish whether the patches to build with
mingw were intended to produce such a compatible build. It's not
clear, but so far it seems that apparently that is *not* the intent
(and worse, mingw-w64 may not even be able to build viable executables
that link with msvcr100 without some heavy hacking, although that's
still somewhat unclear).

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Steve Dower
Greg Ewing wrote:
 Nick Coghlan wrote:
 That assumption will allow MinGW-w64 to link with the appropriate
 MSVCRT versions for extension building without anything breaking.

 If that works, then the same technique should allow CPython itself to be built
 in a VS-compatible way with mingw, shouldn't it?

 Those objecting to a mingw-built python seem to be assuming that such a thing
 will necessarily be incompatible with VS builds, but I don't see why that has 
 to
 be the case.

That's true, and a good point that I missed. However, the main (practical) 
desire for building CPython with something other than VS seems to be to avoid 
having to be compatible with VS.

It's entirely possible that having two alternative builds of CPython would 
force everyone to be more compatible, but I think it's more likely to simply 
end up being two different worlds. Maybe I'm being unnecessarily cynical :)

Cheers,
Steve

 --
 Greg


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Steve Dower
 Paul Moore wrote:
 On 27 October 2014 20:45, Greg Ewing greg.ew...@canterbury.ac.nz wrote:
 Nick Coghlan wrote:

 That assumption will allow MinGW-w64 to link with the appropriate
 MSVCRT versions for extension building without anything breaking.


 If that works, then the same technique should allow CPython itself to
 be built in a VS-compatible way with mingw, shouldn't it?
 
 Yes.
 
 Those objecting to a mingw-built python seem to be assuming that such
 a thing will necessarily be incompatible with VS builds, but I don't
 see why that has to be the case.
 
 No, we've been trying to establish whether the patches to build with mingw 
 were
 intended to produce such a compatible build. It's not clear, but so far it 
 seems
 that apparently that is *not* the intent (and worse, mingw-w64 may not even be
 able to build viable executables that link with msvcr100 without some heavy
 hacking, although that's still somewhat unclear).

Unless there is also opposition to moving to VC14, I'd rather see the mingw 
projects invest in linking to those libraries. I believe they'll have a much 
easier time of it than worrying about VC10, and the investment will be worth 
more in the future as the public API of the CRT stops changing.

Unfortunately, I'm not able to help out more than I've already offered 
(researching answers to specific questions). Largely because I have enough 
work-outside-work going on, but also because my employer won't like me getting 
involved with GPL'd software at all.

Cheers,
Steve

 Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Ray Donnelly
On Mon, Oct 27, 2014 at 8:54 PM, Steve Dower steve.do...@microsoft.com wrote:
 Greg Ewing wrote:
 Nick Coghlan wrote:
 That assumption will allow MinGW-w64 to link with the appropriate
 MSVCRT versions for extension building without anything breaking.

 If that works, then the same technique should allow CPython itself to be 
 built
 in a VS-compatible way with mingw, shouldn't it?

 Those objecting to a mingw-built python seem to be assuming that such a thing
 will necessarily be incompatible with VS builds, but I don't see why that 
 has to
 be the case.

 That's true, and a good point that I missed. However, the main (practical) 
 desire for building CPython with something other than VS seems to be to avoid 
 having to be compatible with VS.

I've no idea where you get that impression from; no one has expressed
anything even approximating that. For me it's to avoid using closed
source software for my hobbyist programming and to help to create a
vibrant Open Source distribution for Windows, because I quite like
Windows; it's got a lot going for it. For others it's to ensure that
everything they care about (CPython with Fortran for example) works
together properly and reliably. I expect that avoiding compatibility
couldn't be further from any of our wishes.


 It's entirely possible that having two alternative builds of CPython would 
 force everyone to be more compatible, but I think it's more likely to simply 
 end up being two different worlds. Maybe I'm being unnecessarily cynical :)

 Cheers,
 Steve

 --
 Greg


Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Terry Reedy

On 10/27/2014 12:23 PM, Stefan Richthofer wrote:



You mean Jython deletes instance attributes before calling __del__ ?


No. I think the term "object resurrection" usually does not mean bringing
back a deleted object in the sense that its memory was already freed.
I think it rather means that nothing referred to an object, so it was on
the kill-list of the gc or of the zero-ref-count macro.


In either case, there is a final reference keeping the object alive, 
like a hospital patient kept alive by a final link with a life-support
machine.  I think 'resuscitation' might be a better metaphor.


--
Terry Jan Reedy



Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Paul Moore
On 27 October 2014 21:19, Steve Dower steve.do...@microsoft.com wrote:
 No, we've been trying to establish whether the patches to build with mingw 
 were
 intended to produce such a compatible build. It's not clear, but so far it 
 seems
 that apparently that is *not* the intent (and worse, mingw-w64 may not even 
 be
 able to build viable executables that link with msvcr100 without some heavy
 hacking, although that's still somewhat unclear).

 Unless there is also opposition to moving to VC14, I'd rather see the mingw
 projects invest in linking to those libraries. I believe they'll have a much 
 easier
 time of it than worrying about VC10, and the investment will be worth more in
 the future as the public API of the CRT stops changing.

I think the point is that anything other than msvcrt is extra work,
because using msvcrt is coded into the guts of gcc (which in turn is
because msvcrt is apparently OK to consider as part of the OS in GPL
legality terms). So whether it's the vc10 libraries or the vc14 ones
is irrelevant - and mingw ships with the vc10 link library, so it's
easier to discuss the problem in terms of vc10. But yes, vc14 would be
the long term target.

Of course if the vc14 libs were deemed as shipped with the OS and/or
were named msvcrt.dll, then that would be different. But I assume
that's not what will happen.

Paul


Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Devin Jeanpierre
On Sun, Oct 26, 2014 at 3:41 PM, Paul Moore p.f.mo...@gmail.com wrote:
 Not really, to be honest. I still don't understand why anyone not
 directly involved in CPython development would need to build their own
 Python executable on Windows.

Late Python bugfix releases are source-only, so if you suffer from a
bug and need to get it fixed, you need to build Python from source.

https://www.python.org/download/releases/2.6.9/ has no windows binary
and includes several security fixes.

-- Devin


Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes break on object resurrection

2014-10-27 Thread Stefan Richthofer
"I think 'resuscitation' might be a better metaphor."

The term 'resurrection' is not my invention, but well established:
http://en.wikipedia.org/wiki/Object_resurrection

I well understand why Antoine objects to calling it resurrection in CPython
due to implementation-specific reasons. But in the above article (which I
consider rather detailed) I can't find anything stating that an object's
ref-count must drop to zero at any time in order to call it resurrected. In
contrast, it clarifies that objects can not only resurrect themselves:
"...which may in turn make that object or another garbage object (reachable
from the object with a finalizer) reachable again" and "If this happens, the
referenced object – which is not necessarily the finalized object – is no
longer garbage, and cannot be deallocated".

 x2 does *not* have its refcount drop to zero, since it is still
 referenced by x. In other words, x2 can only be on a kill list
 after x has been finalized, which can only be *after* __del__ was
 executed.

x resurrects x2 in the sense that it must actively take an action
in its finalizer that establishes a new reference to x2 from non-garbage or
environment memory. Otherwise x, as the final life-support link of x2,
would cause x2's ref count to *actually* drop to zero in the next step.

I never wanted this to become a discussion about the definition of object
resurrection. I just wanted to understand which differences in
behavior (such as weakref breaking) are okay and which are bugs (such as
breaking the consistency of id() in Jython).


Regards

-Stefan



 Sent: Monday, 27 October 2014 at 23:36
 From: Terry Reedy tjre...@udel.edu
 To: python-dev@python.org
 Subject: Re: [Python-Dev] results of id() and weakref.getweakrefs() sometimes
 break on object resurrection

 On 10/27/2014 12:23 PM, Stefan Richthofer wrote:
 
  You mean Jython deletes instance attributes before calling __del__ ?
 
  No. I think the term "object resurrection" usually does not mean bringing
  back a deleted object in the sense that its memory was already freed.
  I think it rather means that nothing referred to an object, so it was on
  the kill-list of the gc or of the zero-ref-count macro.
 
 In either case, there is a final reference keeping the object alive, 
 like a hospital patient kept alive by a final link with a life-support
 machine.  I think 'resuscitation' might be a better metaphor.
 
 -- 
 Terry Jan Reedy
 



Re: [Python-Dev] Status of C compilers for Python on Windows

2014-10-27 Thread Stephen J. Turnbull
R. David Murray writes:
  On Sun, 26 Oct 2014 00:19:44 +0200, Antoine Pitrou solip...@pitrou.net 
  wrote:

   My point is that your Windows build would not have the same behaviour
   as a MSVC-produced Windows build, and so testing it with it would not
   certify that your code would actually be compatible with genuine
   MSVC builds of CPython, which we will not stop supporting.
  
  While true, I don't think that matters for Chris' point.

[...]

  If I could use a more linux-like toolchain to build CPython on windows,
  I would doubtless do much more testing on windows for stuff where I
  think windows might behave differently (and I might look at more Windows
  bugs...though frankly there are plenty of bugs for me to look at without
  looking at Windows bugs).
  
  This is not necessarily a compelling argument for MinGW support.
  However, it *is* a valid argument, IMO.

Nobody claims that there are not arguments, even compelling
arguments, for MinGW support (more generally, support for alternative
toolchains).

But there are *also* compelling arguments for *supporting* *both* those
"no need to worry about mixed ABIs" situations and *mixed* situations.
And that becomes python-dev's problem if the patches are added to core
Python.  Currently, they're somebody else's problem, and that's as it
should be at this stage.

Python is open source.  Nobody is objecting to somebody else doing
this.[1]  The problem here is simply that some somebody elses are
trying to throw future work over the wall into python-dev space.
There is nothing wrong with that, either -- that's why there is a
stdlib, for example -- but the python-dev concerns about platform
fragmentation are genuine (even if not applicable to all potential
users of the alternative toolchains), and substantial resources will
be needed to do the testing required to meet python-dev's requirement
that such code be *binary* compatible with other binaries downloaded
for Windows, as well as for maintenance of the code itself.


Footnotes: 
[1]  Some *do* question whether there's a need for anybody to do this,
and that's bogus.  "I just wanna" is a good enough reason to do it.  The
issue here is that it's not a good enough reason for python-dev to do
the support and maintenance going forward.
