Re: [Python-Dev] PEP 396, Module Version Numbers
On Thu, Apr 7, 2011 at 2:55 PM, Glenn Linderman v+pyt...@g.nevcal.com wrote: __version__ = 7.9.7 # replaced by packaging If you don't upload your module to PyPI, then you can do whatever you want with your versioning info. If you *do* upload it to PyPI, then part of doing so properly is to package it so that your metadata is where other utilities expect it to be. At that point, you can move the version info over to setup.cfg and add the code into the module to read it from the metadata store. The PEP doesn't mention PyPI, and at present none of the modules there use packaging :) They all use distutils (or setuptools or distutils2) though, which is what packaging replaces. (Sorry for not making that clear - it's easy to forget which aspects of these issues aren't common knowledge as yet) So it wasn't obvious to me that the PEP applies only to PyPI, and I have used modules that were not available from PyPI yet were still distributed and packaged somehow (not using packaging clearly). packaging is the successor to the current distutils package. Distribution via PyPI is the main reason to bother with creating a correctly structured package - for internal distribution, people use all sorts of ad hoc schemes (often just the packaging systems of their internal target platforms). I'll grant that some people do use properly structured packages for purely internal use, but I'd also be willing to bet that they're the exception rather than the rule. What I would like to see the PEP say is that if you don't *have* a setup.cfg file, then go ahead and embed the version directly in your Python source file. If you *do* have one, then put the version there and retrieve it with pkgutil if you want to provide a __version__ attribute. Barry is welcome to make a feature request to allow that dependency to go the other way, with the packaging system reading the version number out of the source file, but such a suggestion doesn't belong in an Informational PEP. 
If such a feature is ever accepted, then the recommendation in the PEP could be updated. While there has been much effort (discussion by many) to make packaging useful to many, and that is probably a good thing, I still wonder why a packaging system should be loaded into applications when all the code has already been installed. Or is the runtime of packaging packaged so that only a small amount of code has to be loaded to obtain version and __version__? I don't recall that being discussed on this list, but maybe it has been on more focused lists, sorry for my ignorance... but I also read about embedded people complaining about how many files Python opens at start up, and see no need for a full packaging system to be loaded, just to do version checking. pkgutil will be able to read the metadata - it is a top level standard library module, *not* a submodule of distutils/packaging. It may make sense for the version parsing support to be in pkgutil as well, since PEP 345 calls for it to be stored as a string in the package metadata, but it needs to be converted with NormalizedVersion to be safe to use in arbitrary version range checks. That's Tarek's call as to whether to provide it that way, or as a submodule of packaging. As you say, the fact that distutils/packaging are usually first on the chopping block when distros are looking to save space is a strong point in favour of having that particular functionality somewhere else. That said, I've seen people have problems because a Python 2.6 redistributor decided contextlib wasn't important and left it out, so YMMV regardless of where the code ends up. Cheers, Nick. -- Nick Coghlan | ncogh...@gmail.com | Brisbane, Australia ___ Python-Dev mailing list Python-Dev@python.org http://mail.python.org/mailman/listinfo/python-dev Unsubscribe: http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
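The point about needing NormalizedVersion before doing arbitrary range checks can be illustrated with a toy comparison. The helper below is purely illustrative - it only handles plain X.Y[.Z] versions, whereas PEP 386's NormalizedVersion also parses pre-release and dev suffixes:

```python
# Naive string comparison orders version strings incorrectly:
print('1.10' > '1.9')   # False - lexicographic comparison is wrong here

def version_tuple(s):
    # Illustrative helper only; NormalizedVersion (PEP 386) does far more.
    return tuple(int(part) for part in s.split('.'))

print(version_tuple('1.10') > version_tuple('1.9'))  # True
```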
Re: [Python-Dev] PEP 399: Pure Python/C Accelerator Module Compatibility Requirements
On Thu, Apr 7, 2011 at 3:15 PM, Stefan Behnel stefan...@behnel.de wrote: Assuming there always is an equivalent Python implementation anyway, what about using that as a fallback for input types that the C implementation cannot deal with? Or would it be a larger surprise for users if the code ran slower when passing in a custom type than if it throws an exception instead? It often isn't practical - the internal structures of the two don't necessarily play nicely together. It's an interesting idea for heapq in particular, though. (The C module could fairly easily alias the Python versions with underscore prefixes, then fall back to those instead of raising an error if PyList_CheckExact fails). Cheers, Nick.
Re: [Python-Dev] PEP 396, Module Version Numbers
On 4/6/2011 11:53 PM, Nick Coghlan wrote: On Thu, Apr 7, 2011 at 2:55 PM, Glenn Lindermanv+pyt...@g.nevcal.com wrote: __version__ = 7.9.7 # replaced by packaging If you don't upload your module to PyPI, then you can do whatever you want with your versioning info. If you *do* upload it to PyPI, then part of doing so properly is to package it so that your metadata is where other utilities expect it to be. At that point, you can move the version info over to setup.cfg and add the code into the module to read it from the metadata store. The PEP doesn't mention PyPI, and at present none of the modules there use packaging :) They all use distutils (or setuptools or distutils2) though, which is what packaging replaces. (Sorry for not making that clear - it's easy to forget which aspects of these issues aren't common knowledge as yet) I knew that packaging replaced those others, but was unaware that those were the only two methods used on PyPI. Not that I'd heard of or experienced any others from that source, but there are many packages there. So it wasn't obvious to me that the PEP applies only to PyPI, and I have used modules that were not available from PyPI yet were still distributed and packaged somehow (not using packaging clearly). packaging is the successor to the current distutils package. Distribution via PyPI is the main reason to bother with creating a correctly structured package - for internal distribution, people use all sorts of ad hoc schemes (often just the packaging systems of their internal target platforms). I'll grant that some people do use properly structured packages for purely internal use, but I'd also be willing to bet that they're the exception rather than the rule. What I would like to see the PEP say is that if you don't *have* a setup.cfg file, then go ahead and embed the version directly in your Python source file. If you *do* have one, then put the version there and retrieve it with pkgutil if you want to provide a __version__ attribute. 
Barry is welcome to make a feature request to allow that dependency to go the other way, with the packaging system reading the version number out of the source file, but such a suggestion doesn't belong in an Informational PEP. If such a feature is ever accepted, then the recommendation in the PEP could be updated. While there has been much effort (discussion by many) to make packaging useful to many, and that is probably a good thing, I still wonder why a packaging system should be loaded into applications when all the code has already been installed. Or is the runtime of packaging packaged so that only a small amount of code has to be loaded to obtain version and __version__? I don't recall that being discussed on this list, but maybe it has been on more focused lists, sorry for my ignorance... but I also read about embedded people complaining about how many files Python opens at start up, and see no need for a full packaging system to be loaded, just to do version checking. pkgutil will be able to read the metadata - it is a top level standard library module, *not* a submodule of distutils/packaging. It may make sense for the version parsing support to be in pkgutil as well, since PEP 345 calls for it to be stored as a string in the package metadata, but it needs to be converted with NormalizedVersion to be safe to use in arbitrary version range checks. That's Tarek's call as to whether to provide it that way, or as a submodule of packaging. As you say, the fact that distutils/packaging are usually first on the chopping block when distros are looking to save space is a strong point in favour of having that particular functionality somewhere else. This sounds more practical; if I recall prior discussions correctly, pkgutil reads a standard set of metadata data packaging systems should provide, and version would seem to be part of that, more so than of packaging itself... 
seems things would have a better (smaller at runtime) dependency tree that way, from what I understand about it. That said, I've seen people have problems because a Python 2.6 redistributor decided contextlib wasn't important and left it out, so YMMV regardless of where the code ends up. :) Cheers, Nick Thanks Nick, for the info in this thread. This is mostly a thank you note for helping me understand better.
Re: [Python-Dev] PEP 396, Module Version Numbers
On Thu, Apr 7, 2011 at 5:05 PM, Glenn Linderman v+pyt...@g.nevcal.com wrote: On 4/6/2011 11:53 PM, Nick Coghlan wrote: They all use distutils (or setuptools or distutils2) though, which is what packaging replaces. (Sorry for not making that clear - it's easy to forget which aspects of these issues aren't common knowledge as yet) I knew that packaging replaced those others, but was unaware that those were the only two methods used on PyPI. Not that I'd heard of or experienced any others from that source, but there are many packages there. I believe it is possible to get stuff up onto PyPI without actually using one of the various packaging utilities, but such entries generally won't play well with others (including automated tools like pip and cheesecake). Cheers, Nick.
Re: [Python-Dev] PEP 396, Module Version Numbers
On 06/04/2011 15:26, Nick Coghlan wrote: On Wed, Apr 6, 2011 at 6:22 AM, Glenn Linderman v+pyt...@g.nevcal.com wrote: With more standardization of versions, should the version module be promoted to stdlib directly? When Tarek lands packaging (i.e. what distutils2 becomes in the Python 3.3 stdlib), the standardised version handling will come with it. On 4/5/2011 11:52 AM, Barry Warsaw wrote: DEFAULT_VERSION_RE = re.compile(r'(?P<version>\d+\.\d(?:\.\d+)?)') __version__ = pkgutil.get_distribution('elle').metadata['version'] I really dislike this way of specifying the version. For a start it is really ugly. More importantly it means the version information is *only* available if the package has been installed by packaging, and so isn't available for the parts of my pre-build process like building the documentation (which import the version number to put into the docs). Currently all my packages have the canonical version number information in the package itself using: __version__ = '1.2.3' Anything that needs the version number, including setup.py for upload to pypi, has one place to look for it and it doesn't depend on any other tools or processes. If switching to packaging prevents me from doing this then it will inhibit me using packaging. What I may have to do is use a python script that will generate the static metadata, which is not such a bad thing I guess as it will only need to be executed at package build time. I won't be switching to that horrible technique for specifying versions within my packages though. All the best, Michael The RE as given won't match alpha, beta, rc, dev, and post suffixes that are discussed in PEP 386. Indeed, I really don't like the RE suggestion - better to tell people to just move the version info into the static config file and use pkgutil to make it available as shown. That solves the build time vs install time problem as well. Nor will it match the code shown and quoted for the alternative distutils2 case.
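The objection that the suggested RE won't match PEP 386 style suffixes is easy to verify directly (a quick sketch, with the pattern's named group written out in full):

```python
import re

# The suggested pattern from the PEP discussion
DEFAULT_VERSION_RE = re.compile(r'(?P<version>\d+\.\d(?:\.\d+)?)')

print(DEFAULT_VERSION_RE.match('1.2.3').group('version'))  # 1.2.3
# A pre-release suffix is silently truncated rather than captured:
print(DEFAULT_VERSION_RE.match('1.2b2').group('version'))  # 1.2
```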
Other comments: Are there issues for finding and loading multiple versions of the same module? No, you simply can't do it. Python's import semantics are already overly complicated even without opening that particular can of worms. Should it be possible to determine a version before loading a module? If yes, the version module would have to be able to find and parse version strings in any of the many places this PEP suggests they could be... so that would be somewhat complex, but the complexity shouldn't be used to change the answer... but if the answer is yes, it might encourage fewer variant cases to be supported for acceptable version definition locations for this PEP. Yep, this is why the version information should be in the setup.cfg file, and hence available via pkgutil without loading the module first. Cheers, Nick. -- http://www.voidspace.org.uk/ May you do good and not evil May you find forgiveness for yourself and forgive others May you share freely, never taking more than you give. -- the sqlite blessing http://www.sqlite.org/different.html
Re: [Python-Dev] PEP 396, Module Version Numbers
On 07/04/2011 12:10, Michael Foord wrote: On 06/04/2011 15:26, Nick Coghlan wrote: On Wed, Apr 6, 2011 at 6:22 AM, Glenn Linderman v+pyt...@g.nevcal.com wrote: With more standardization of versions, should the version module be promoted to stdlib directly? When Tarek lands packaging (i.e. what distutils2 becomes in the Python 3.3 stdlib), the standardised version handling will come with it. On 4/5/2011 11:52 AM, Barry Warsaw wrote: DEFAULT_VERSION_RE = re.compile(r'(?P<version>\d+\.\d(?:\.\d+)?)') __version__ = pkgutil.get_distribution('elle').metadata['version'] I really dislike this way of specifying the version. For a start it is really ugly. More importantly it means the version information is *only* available if the package has been installed by packaging, and so isn't available for the parts of my pre-build process like building the documentation (which import the version number to put into the docs). And in fact it would make the module itself unimportable unless installed by packaging, so not compatible with other installation methods (including the ever-loved 'just drop it somewhere on sys.path') or earlier versions of Python that don't have the required apis (or don't have packaging installed). So I don't think recommending pkgutil.get_distribution('elle').metadata['version'] as a way for packages to provide version information is good advice. All the best, Michael Foord Currently all my packages have the canonical version number information in the package itself using: __version__ = '1.2.3' Anything that needs the version number, including setup.py for upload to pypi, has one place to look for it and it doesn't depend on any other tools or processes. If switching to packaging prevents me from doing this then it will inhibit me using packaging. What I may have to do is use a python script that will generate the static metadata, which is not such a bad thing I guess as it will only need to be executed at package build time.
I won't be switching to that horrible technique for specifying versions within my packages though. All the best, Michael The RE as given won't match alpha, beta, rc, dev, and post suffixes that are discussed in PEP 386. Indeed, I really don't like the RE suggestion - better to tell people to just move the version info into the static config file and use pkgutil to make it available as shown. That solves the build time vs install time problem as well. Nor will it match the code shown and quoted for the alternative distutils2 case. Other comments: Are there issues for finding and loading multiple versions of the same module? No, you simply can't do it. Python's import semantics are already overly complicated even without opening that particular can of worms. Should it be possible to determine a version before loading a module? If yes, the version module would have to be able to find and parse version strings in any of the many places this PEP suggests they could be... so that would be somewhat complex, but the complexity shouldn't be used to change the answer... but if the answer is yes, it might encourage fewer variant cases to be supported for acceptable version definition locations for this PEP. Yep, this is why the version information should be in the setup.cfg file, and hence available via pkgutil without loading the module first. Cheers, Nick.
Re: [Python-Dev] PEP 396, Module Version Numbers
On Thu, 07 Apr 2011 12:10:59 +0100 Michael Foord fuzzy...@voidspace.org.uk wrote: On 06/04/2011 15:26, Nick Coghlan wrote: On Wed, Apr 6, 2011 at 6:22 AM, Glenn Linderman v+pyt...@g.nevcal.com wrote: With more standardization of versions, should the version module be promoted to stdlib directly? When Tarek lands packaging (i.e. what distutils2 becomes in the Python 3.3 stdlib), the standardised version handling will come with it. On 4/5/2011 11:52 AM, Barry Warsaw wrote: DEFAULT_VERSION_RE = re.compile(r'(?P<version>\d+\.\d(?:\.\d+)?)') __version__ = pkgutil.get_distribution('elle').metadata['version'] I really dislike this way of specifying the version. For a start it is really ugly. Agreed, it is incredibly obscure and unpleasantly opaque. Regards Antoine.
Re: [Python-Dev] PEP 396, Module Version Numbers
On Thu, Apr 7, 2011 at 9:10 PM, Michael Foord fuzzy...@voidspace.org.uk wrote: I really dislike this way of specifying the version. For a start it is really ugly. More importantly it means the version information is *only* available if the package has been installed by packaging, and so isn't available for the parts of my pre-build process like building the documentation (which import the version number to put into the docs). Currently all my packages have the canonical version number information in the package itself using: __version__ = '1.2.3' Anything that needs the version number, including setup.py for upload to pypi, has one place to look for it and it doesn't depend on any other tools or processes. If switching to packaging prevents me from doing this then it will inhibit me using packaging. What I may have to do is use a python script that will generate the static metadata, which is not such a bad thing I guess as it will only need to be executed at package build time. I won't be switching to that horrible technique for specifying versions within my packages though. It sounds like part of the PEP needs another trip through distutils-sig. An informational PEP really shouldn't be advocating standard library changes, but it would make sense for this point of view to inform any updates to PEP 386 (the version handling standardisation PEP). As I see it, there appear to be two main requests: 1. Normalised version parsing and comparison should be available even if packaging itself is not installed (e.g. as part of pkgutil) 2. 
packaging should support extraction of the version metadata from the source files when bundling a package for distribution. On point 2, rather than requiring that it be explicitly requested, I would suggest the following semantics for determining the version when bundling a package ready for distribution:
- if present in the metadata, use that
- if not present in the metadata, look for __version__ in the module source code (or the __init__ source code for an actual package)
- otherwise warn the developer that no version information has been provided so it is automatically being set to 0.0.0a0
Cheers, Nick.
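Nick's suggested lookup order is simple enough to sketch directly. The function and argument names below are illustrative only, not a real packaging API:

```python
import re
import warnings

def resolve_version(metadata, source):
    """Illustrative sketch of the suggested three-step fallback order."""
    # 1. Prefer an explicit version in the metadata (e.g. from setup.cfg)
    if metadata.get('version'):
        return metadata['version']
    # 2. Fall back to a __version__ assignment in the module source
    match = re.search(r"^__version__\s*=\s*['\"]([^'\"]+)['\"]",
                      source, re.MULTILINE)
    if match:
        return match.group(1)
    # 3. Warn the developer and use a conspicuous placeholder
    warnings.warn("no version information found; defaulting to 0.0.0a0")
    return '0.0.0a0'

print(resolve_version({}, "__version__ = '1.2.3'\n"))  # 1.2.3
```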
Re: [Python-Dev] PEP 396, Module Version Numbers
On 07/04/2011 12:59, Nick Coghlan wrote: On Thu, Apr 7, 2011 at 9:10 PM, Michael Foordfuzzy...@voidspace.org.uk wrote: I really dislike this way of specifying the version. For a start it is really ugly. More importantly it means the version information is *only* available if the package has been installed by packaging, and so isn't available for the parts of my pre-build process like building the documentation (which import the version number to put into the docs). Currently all my packages have the canonical version number information in the package itself using: __version__ = '1.2.3' Anything that needs the version number, including setup.py for upload to pypi, has one place to look for it and it doesn't depend on any other tools or processes. If switching to packaging prevents me from doing this then it will inhibit me using packaging. What I may have to do is use a python script that will generate the static metadata, which is not such a bad thing I guess as it will only need to be executed at package build time. I won't be switching to that horrible technique for specifying versions within my packages though. It sounds like part of the PEP needs another trip through distutils-sig. An informational PEP really shouldn't be advocating standard library changes, but it would make sense for this point of view to inform any updates to PEP 386 (the version handling standardisation PEP). As I see it, there appear to be two main requests: 1. Normalised version parsing and comparison should be available even if packaging itself is not installed (e.g. as part of pkgutil) 2. 
packaging should support extraction of the version metadata from the source files when bundling a package for distribution On point 2, rather than requiring that it be explicitly requested, I would suggest the following semantics for determining the version when bundling a package ready for distribution: - if present in the metadata, use that - if not present in the metadata, look for __version__ in the module source code (or the __init__ source code for an actual package) - otherwise warn the developer that no version information has been provided so it is automatically being set to 0.0.0a0 This sounds good to me. As an added consideration the suggested technique may not work for tools like py2exe / py2app, embedded python and alternative implementations - which may not have the full packaging machinery available. All the best, Michael Foord Cheers, Nick.
Re: [Python-Dev] PEP 396, Module Version Numbers
On Wed, Apr 06, 2011 at 11:04:08AM +0200, John Arbash Meinel wrote: ... #. ``__version_info__`` SHOULD be of the format returned by PEP 386's ``parse_version()`` function. The only reference to parse_version in PEP 386 I could find was the setuptools implementation, which is pretty odd: In other words, parse_version will return a tuple for each version string, that is compatible with StrictVersion but also accept arbitrary version and deal with them so they can be compared:

>>> from pkg_resources import parse_version as V
>>> V('1.2')
('0001', '0002', '*final')
>>> V('1.2b2')
('0001', '0002', '*b', '0002', '*final')
>>> V('FunkyVersion')
('*funkyversion', '*final')

Barry -- I think we want to talk about NormalizedVersion.from_parts() rather than parse_version(). bzrlib has certainly used 'version_info' as a tuple indication such as: version_info = (2, 4, 0, 'dev', 2) and version_info = (2, 4, 0, 'beta', 1) and version_info = (2, 3, 1, 'final', 0) etc. This is mapping what we could sort out from Python's sys.version_info. The *really* nice bit is that you can do: if sys.version_info >= (2, 6): # do stuff for python 2.6(.0) and beyond *nod* People like to compare versions and the tuple forms allow that. Note that the tuples you give don't compare correctly. This is the order that they sort:

(2, 4, 0)
(2, 4, 0, 'beta', 1)
(2, 4, 0, 'dev', 2)
(2, 4, 0, 'final', 0)

So that means snapshot releases will always sort after the alpha and beta releases (and release candidates if you use 'c' to mean release candidate). Since the simple (2, 4, 0) tuple sorts before everything else, a comparison that doesn't work with the 2.4.0-alpha (or beta or arbitrary dev snapshots) would need to specify something like: (2, 4, 0, 'z') NormalizedVersion.from_parts() uses nested tuples to handle this better. But I think that even with nested tuples a naive comparison fails since most of the suffixes are prerelease strings.
i.e.:

((2, 4, 0),)
((2, 4, 0), ('beta', 1))

So you can't escape needing a function to compare versions. (NormalizedVersion does this by letting you compare NormalizedVersions together). Barry, if this is correct, maybe __version_info__ is useless and I shouldn't have brought it up at PyCon? -Toshio
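The sorting point is easy to check interactively: plain tuple comparison puts the bare release first and then orders the suffixes alphabetically, so pre-releases and dev snapshots all compare greater than the release itself:

```python
versions = [
    (2, 4, 0, 'final', 0),
    (2, 4, 0, 'dev', 2),
    (2, 4, 0),
    (2, 4, 0, 'beta', 1),
]
for v in sorted(versions):
    print(v)
# (2, 4, 0) sorts first, so a naive check like version_info >= (2, 4, 0)
# is already true for the beta and the dev snapshot:
print((2, 4, 0, 'beta', 1) >= (2, 4, 0))  # True
```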
[Python-Dev] Test cases not garbage collected after run
I actually created a bug entry for this (http://bugs.python.org/issue11798) and just later it occurred that I should've asked in the list first :) So, here's the text for opinions: Right now, when doing a test case, one must clear all the variables created in the test class, and I believe this shouldn't be needed... E.g.:

class Test(TestCase):
    def setUp(self):
        self.obj1 = MyObject()
    ...
    def tearDown(self):
        del self.obj1

Ideally (in my view), right after running the test, it should be garbage-collected and the explicit tearDown just for deleting the object wouldn't be needed (as the test would be garbage-collected, that reference would automatically die), because this is currently very error prone... (and probably a source of leaks for any sufficiently big test suite). If that's accepted, I can provide a patch. Thanks, Fabio
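The retention being described is easy to observe: objects assigned in setUp stay alive exactly as long as the TestCase instance does, so any run that keeps its test instances keeps every fixture too. A small demonstration (not a patch):

```python
import gc
import unittest
import weakref

class MyObject:
    pass

class Test(unittest.TestCase):
    def setUp(self):
        self.obj1 = MyObject()  # no tearDown: rely on the test dying

    def test_something(self):
        self.assertIsInstance(self.obj1, MyObject)

test = Test('test_something')
test.run(unittest.TestResult())
ref = weakref.ref(test.obj1)
gc.collect()
alive_while_referenced = ref() is not None  # test instance still holds obj1
del test
gc.collect()
alive_after = ref() is not None  # dropping the test frees the fixture
print(alive_while_referenced, alive_after)  # True False
```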
Re: [Python-Dev] Force build form
On Thu, Apr 7, 2011 at 12:40 AM, Antoine Pitrou solip...@pitrou.net wrote: For the record, I've tried to make the force build form clearer on the buildbot Web UI. See e.g.: http://www.python.org/dev/buildbot/all/builders/x86%20OpenIndiana%20custom Cool. I've recently discovered the buildbot page for Twisted. It is more convenient to have the build request form on the right. http://buildbot.twistedmatrix.com/builders/winxp32-py2.6-msi/ -- anatoly t.
Re: [Python-Dev] Code highlighting in tracker
On Thu, Apr 7, 2011 at 7:01 AM, Benjamin Peterson benja...@python.org wrote: 2011/4/6 anatoly techtonik techto...@gmail.com: Is it a good idea to have code highlighting in tracker? Why would we need it? Because tracker is ugly.
Re: [Python-Dev] Code highlighting in tracker
2011/4/7 anatoly techtonik techto...@gmail.com: On Thu, Apr 7, 2011 at 7:01 AM, Benjamin Peterson benja...@python.org wrote: 2011/4/6 anatoly techtonik techto...@gmail.com: Is it a good idea to have code highlighting in tracker? Why would we need it? Because tracker is ugly. So we should add some highlighted code to spice it up? :) -- Regards, Benjamin
Re: [Python-Dev] Code highlighting in tracker
On Thu, Apr 7, 2011 at 7:01 AM, Benjamin Peterson benja...@python.org wrote: 2011/4/6 anatoly techtonik techto...@gmail.com: Is it a good idea to have code highlighting in tracker? Why would we need it? Because tracker is ugly. That's not a good enough reason. I'm -1 on adding this: it's yet another thing to maintain, and adding markup to the tracker would increase the mental burden for using it. Eric.
Re: [Python-Dev] Test cases not garbage collected after run
On 07/04/2011 17:18, Fabio Zadrozny wrote: I actually created a bug entry for this (http://bugs.python.org/issue11798) and just later it occurred that I should've asked in the list first :) So, here's the text for opinions: Right now, when doing a test case, one must clear all the variables created in the test class, and I believe this shouldn't be needed... E.g.: class Test(TestCase): def setUp(self): self.obj1 = MyObject() ... def tearDown(self): del self.obj1 Ideally (in my view), right after running the test, it should be garbage-collected and the explicit tearDown just for deleting the object wouldn't be needed (as the test would be garbage-collected, that reference would automatically die), because this is currently very error prone... (and probably a source of leaks for any sufficiently big test suite). If that's accepted, I can provide a patch. You mean that the test run keeps the test instances alive for the whole test run, so instance attributes are also kept alive. How would you solve this - by having TestSuite (calling a suite is how a test run is executed) remove members from itself after each test execution? (Any failure tracebacks etc stored by the TestResult would also have to not keep the test alive.) My only concern would be backwards compatibility due to the change in behaviour. All the best, Michael Foord Thanks, Fabio
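One way to prototype the behaviour Michael describes - a suite that stops referencing each test once it has run - is a small TestSuite subclass. This is a sketch only: it pokes at the private _tests list, and a real fix would live inside unittest itself:

```python
import gc
import unittest
import weakref

class ForgetfulSuite(unittest.TestSuite):
    """Sketch: drop each test reference as soon as it has executed."""
    def run(self, result):
        for i, test in enumerate(self._tests):  # _tests is a private detail
            test(result)
            self._tests[i] = None  # let the test (and its fixtures) die
        return result

class Demo(unittest.TestCase):
    def test_pass(self):
        self.assertTrue(True)

test = Demo('test_pass')
ref = weakref.ref(test)
suite = ForgetfulSuite([test])
del test
suite.run(unittest.TestResult())
gc.collect()
print(ref() is None)  # True - the suite no longer keeps the test alive
```

As the thread notes, a TestResult that stores failure tracebacks would still keep failing tests alive; this sketch only covers the passing case.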
Re: [Python-Dev] Code highlighting in tracker
On Thu, Apr 7, 2011 at 11:22, anatoly techtonik techto...@gmail.com wrote: On Thu, Apr 7, 2011 at 7:01 AM, Benjamin Peterson benja...@python.org wrote: 2011/4/6 anatoly techtonik techto...@gmail.com: Is it a good idea to have code highlighting in tracker? Why would we need it? Because the tracker is ugly. It's a bug tracker, not a Myspace profile. Unless the lack of syntax highlighting is causing a work stoppage, I don't think we need this.
Re: [Python-Dev] Code highlighting in tracker
On Apr 7, 2011, at 9:22 AM, anatoly techtonik wrote: On Thu, Apr 7, 2011 at 7:01 AM, Benjamin Peterson benja...@python.org wrote: 2011/4/6 anatoly techtonik techto...@gmail.com: Is it a good idea to have code highlighting in tracker? +0 That has its high points:
* gives tracker entries a more professional appearance, closer to what is done on code paste sites, code viewers, and wikis
* provides a clean way to post code snippets (we've had past issues with whitespace being gobbled up)
The downsides:
* it would probably need a preview button and a markup help screen
* it's just one more thing to learn and maintain
* there are already many ways to do it (code paste, rietveld, attaching a patch, plain text, etc.)
* it smells of feature creep
Raymond
Re: [Python-Dev] Code highlighting in tracker
On Thu, Apr 7, 2011 at 12:57 PM, Raymond Hettinger raymond.hettin...@gmail.com wrote: .. * provide a clean way to post code snippets (we've had past issues with whitespace being gobbled-up) What would really help is if someone would figure out how to stop the tracker from removing the lines that start with the python prompt from comments sent by e-mail. http://psf.upfronthosting.co.za/roundup/meta/issue321 http://psf.upfronthosting.co.za/roundup/meta/issue264
Re: [Python-Dev] Test cases not garbage collected after run
On Fri, Apr 8, 2011 at 4:49 AM, Michael Foord fuzzy...@voidspace.org.uk wrote: You mean that the test run keeps the test instances alive for the whole run, so instance attributes are also kept alive. How would you solve this - by having a TestSuite (which is how a test run is executed) remove members from itself after each test execution? (Any failure tracebacks etc. stored by the TestResult would also have to not keep the test alive.) My only concern would be backwards compatibility due to the change in behaviour. An alternative is, in TestCase.run() / TestCase.__call__(), to make a copy and immediately delegate to it; that leaves the original untouched, permitting run-in-a-loop style helpers to keep working. Testtools did something to address this problem, but I forget what it was offhand. -Rob
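Robert's copy-and-delegate idea could be sketched roughly like this (CopyingTestCase and Demo are hypothetical names; a real implementation would have to deal with tests whose attributes are not safely copyable):

```python
import copy
import unittest

class CopyingTestCase(unittest.TestCase):
    """Sketch of the copy-and-delegate idea: run() works on a
    throwaway shallow copy, so state accumulated during the run
    never attaches to the original instance."""

    def run(self, result=None):
        clone = copy.copy(self)
        return unittest.TestCase.run(clone, result)

class Demo(CopyingTestCase):
    def test_something(self):
        self.state = object()   # lands on the clone, not the original

test = Demo('test_something')
result = unittest.TestResult()
test.run(result)

assert result.wasSuccessful()
assert not hasattr(test, 'state')  # the original was left untouched
```

Because the original instance never runs, it can be called again and again, which is why run-in-a-loop helpers keep working under this scheme.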
Re: [Python-Dev] Policy for making changes to the AST
AFAIK the AST is CPython-specific so should be treated with the same attitude as changes to the bytecode. That means, do it conservatively, since there *are* people who like to write tools that manipulate or analyze this, and while they know they're doing something CPython- and version-specific, they should not be broken by bugfix releases, since the people who *use* their code probably have no idea of the deep magic they're depending on. PyPy implements exactly the same AST. I think Jython also does, although I'm not that sure. There have already been issues - with, say, subclassing AST nodes - where PyPy was incompatible with CPython. That said, it's completely fine from PyPy's perspective to change the AST between major releases.
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
Hi Daniel, Thanks for putting this together. I am a huge supporter of benchmarking efforts. My brief comment is below. On Wed, Apr 6, 2011 at 11:52 AM, DasIch dasdas...@googlemail.com wrote: 1. Definition of the benchmark suite. This will entail contacting developers of Python implementations (CPython, PyPy, IronPython and Jython), via discussion on the appropriate mailing lists. This might be achievable as part of this proposal. If you are reaching out to other projects at this stage, I think you should also be in touch with the Cython people (even if its 'implementation' sits on top of CPython). As a scientist/engineer, what I care about is how Cython benchmarks against CPython. I believe that they have some ideas on benchmarking and have also explored this space. Their inclusion would help me consider this GSoC successful at the end of the day (summer). Thanks for your consideration. Be Well Anthony 2. Implementing the benchmark suite. Based on the prior agreed-upon definition, the suite will be implemented, which means that the benchmarks will be merged into a single mercurial repository on Bitbucket[5]. 3. Porting the suite to Python 3.x. The suite will be ported to 3.x using 2to3[6], as far as possible. The usage of 2to3 will make it easier to make changes to the repository, especially for those still focusing on 2.x. It is to be expected that some benchmarks cannot be ported due to dependencies which are not available on Python 3.x. Those will be ignored by this project, to be ported at a later time when the necessary requirements are met. Start of Program (May 24) == Before the coding (milestones 2 and 3) can begin, it is necessary to agree upon a set of benchmarks everyone is happy with, as described. Midterm Evaluation (July 12) === By the midterm I want to finish the second milestone, and before the evaluation I want to start on the third milestone. Final Evaluation (Aug 16) = In this period the benchmark suite will be ported.
If everything works out perfectly I will even have some time left; if there are problems, I have a buffer here. Probably Asked Questions == Why not use one of the existing benchmark suites for porting? The effort will be wasted if there is no good base to build upon; creating a new benchmark suite based upon the existing ones ensures that there is one. Why not use Git/Bazaar/...? Mercurial is used by CPython and PyPy and is fairly well known and used in the Python community. This ensures easy accessibility for everyone. What will happen with the repository after GSoC/How will access to the repository be handled? I propose to give administrative rights to one or two representatives of each project. Those will provide other developers with write access. Communication = Communication of the progress will be done via Twitter[7] and my blog[8]; if desired I can also send an email with the contents of the blog post to the mailing lists of the implementations. Furthermore I am usually quick to answer via IRC (DasIch on freenode), Twitter or e-mail (dasdas...@gmail.com) if anyone has any questions. Contact with the mentor can be established via the means mentioned above or via Skype. About Me My name is Daniel Neuhäuser, I am 19 years old and currently a student at the Bergstadt-Gymnasium Lüdenscheid[9]. I started programming (with Python) about 4 years ago and became a member of the Pocoo Team[10] after successfully participating in the Google Summer of Code last year, during which I ported Sphinx[11] to Python 3.x and implemented an algorithm to diff abstract syntax trees to preserve comments and translated strings, which has been used by the other GSoC projects targeting Sphinx. .. [1]: https://bitbucket.org/pypy/benchmarks/src .. [2]: http://code.google.com/p/unladen-swallow/ .. [3]: http://hg.python.org/benchmarks/file/tip/performance .. [4]: http://hg.python.org/benchmarks/file/62e754c57a7f/performance/README .. [5]: http://bitbucket.org/ ..
[6]: http://docs.python.org/library/2to3.html .. [7]: http://twitter.com/#!/DasIch .. [8]: http://dasdasich.blogspot.com/ .. [9]: http://bergstadt-gymnasium.de/ .. [10]: http://www.pocoo.org/team/#daniel-neuhauser .. [11]: http://sphinx.pocoo.org/ P.S.: I would like to get in touch with the IronPython developers as well; unfortunately I was not able to find a mailing list or IRC channel - is there anybody who can point me in the right direction?
Re: [Python-Dev] Test cases not garbage collected after run
On 07/04/2011 20:18, Robert Collins wrote: On Fri, Apr 8, 2011 at 4:49 AM, Michael Foord fuzzy...@voidspace.org.uk wrote: [snip...] An alternative is, in TestCase.run() / TestCase.__call__(), to make a copy and immediately delegate to it; that leaves the original untouched, permitting run-in-a-loop style helpers to still work. Testtools did something to address this problem, but I forget what it was offhand. That doesn't sound like a general solution, as not everything is copyable and I don't think we should make that a requirement of tests. The proposed fix is to make test suite runs destructive, either replacing TestCase instances with None or pop'ing tests after they are run (the latter being what twisted Trial does). Run-in-a-loop helpers could still repeatedly iterate over suites, just not call the suite. All the best, Michael -Rob
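The destructive-run approach Michael describes might look something like this sketch (DestructiveSuite is a hypothetical name, and it reaches into TestSuite's internal _tests list, so this is illustrative only, not the eventual patch):

```python
import unittest

class DestructiveSuite(unittest.TestSuite):
    """Sketch of the proposal: drop the suite's reference to each
    test as soon as it has run, so the suite no longer keeps every
    test instance (and its attributes) alive for the whole run."""

    def run(self, result):
        # _tests is an internal detail of unittest.TestSuite.
        while self._tests:
            if result.shouldStop:
                break
            test = self._tests.pop(0)
            test(result)
        return result

class Demo(unittest.TestCase):
    def test_one(self):
        self.big = list(range(1000))  # would otherwise stay alive

suite = DestructiveSuite([Demo('test_one')])
result = unittest.TestResult()
suite.run(result)

assert result.wasSuccessful()
assert list(suite) == []  # the suite no longer references the test
```

This also shows the backwards-compatibility concern: after one run the suite is empty, so calling it a second time silently runs nothing, although iterating over a fresh copy of its tests beforehand still works.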
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On Thu, Apr 7, 2011 at 3:54 PM, Anthony Scopatz scop...@gmail.com wrote: [snip...] If you are reaching out to other projects at this stage, I think you should also be in touch with the Cython people (even if its 'implementation' sits on top of CPython). As a scientist/engineer what I care about is how Cython benchmarks to CPython. [snip...] Right now, we are talking about building speed.python.org to test the speed of python interpreters, over time, and alongside one another - cython *is not* an interpreter. Cython is out of scope for this.
[Python-Dev] funky buildbot problems again...
My Intel Snow Leopard 2 build slave has gone into outer-space again. When I look at it, I see buildslave taking up most of a CPU (80%), and nothing much else going on. The twistd log says: [... much omitted ...] 2011-04-04 08:35:47-0700 [-] sending app-level keepalive 2011-04-04 08:45:47-0700 [-] sending app-level keepalive 2011-04-04 08:55:47-0700 [-] sending app-level keepalive 2011-04-04 09:03:15-0700 [Broker,client] lost remote 2011-04-04 09:03:15-0700 [Broker,client] lost remote 2011-04-04 09:03:15-0700 [Broker,client] lost remote 2011-04-04 09:03:15-0700 [Broker,client] lost remote 2011-04-04 09:03:15-0700 [Broker,client] lost remote 2011-04-04 09:03:15-0700 [Broker,client] Lost connection to dinsdale.python.org:9020 2011-04-04 09:03:15-0700 [Broker,client] twisted.internet.tcp.Connector instance at 0x101629ab8 will retry in 3 seconds 2011-04-04 09:03:15-0700 [Broker,client] Stopping factory buildslave.bot.BotFactory instance at 0x1016299e0 2011-04-04 09:03:18-0700 [-] Starting factory buildslave.bot.BotFactory instance at 0x1016299e0 2011-04-04 09:03:18-0700 [-] Connecting to dinsdale.python.org:9020 2011-04-04 09:03:18-0700 [Uninitialized] Connection to dinsdale.python.org:9020 failed: Connection Refused 2011-04-04 09:03:18-0700 [Uninitialized] twisted.internet.tcp.Connector instance at 0x101629ab8 will retry in 8 seconds 2011-04-04 09:03:18-0700 [Uninitialized] Stopping factory buildslave.bot.BotFactory instance at 0x1016299e0 2011-04-04 09:03:27-0700 [-] Starting factory buildslave.bot.BotFactory instance at 0x1016299e0 2011-04-04 09:03:27-0700 [-] Connecting to dinsdale.python.org:9020 So it's been spinning its wheels for 3 days. Sure looks like the connection attempt is failing, for some reason. I'm using the stock Twisted that comes with Snow Leopard -- tried to upgrade it but apparently can't. 
On my OS X 10.4 buildslave, I see a similar but more successful sequence: 2011-04-04 08:56:06-0700 [-] sending app-level keepalive 2011-04-04 09:04:39-0700 [Broker,client] lost remote 2011-04-04 09:04:39-0700 [Broker,client] lost remote 2011-04-04 09:04:39-0700 [Broker,client] lost remote 2011-04-04 09:04:39-0700 [Broker,client] lost remote 2011-04-04 09:04:39-0700 [Broker,client] lost remote 2011-04-04 09:04:39-0700 [Broker,client] twisted.internet.tcp.Connector instance at 0x10352d8 will retry in 3 seconds 2011-04-04 09:04:39-0700 [Broker,client] Stopping factory buildslave.bot.BotFactory instance at 0x133bd78 2011-04-04 09:04:42-0700 [-] Starting factory buildslave.bot.BotFactory instance at 0x133bd78 2011-04-04 09:04:43-0700 [Uninitialized] twisted.internet.tcp.Connector instance at 0x10352d8 will retry in 10 seconds 2011-04-04 09:04:43-0700 [Uninitialized] Stopping factory buildslave.bot.BotFactory instance at 0x133bd78 2011-04-04 09:04:53-0700 [-] Starting factory buildslave.bot.BotFactory instance at 0x133bd78 2011-04-04 09:04:57-0700 [Broker,client] message from master: attached Bill
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On 06/04/2011 17:52, DasIch wrote: Hello Guys, I would like to present my proposal for the Google Summer of Code, concerning the idea of porting the benchmarks to Python 3.x for speed.pypy.org. I think I have successfully integrated the feedback I got from prior discussions on the topic and I would like to hear your opinion. [snip...] P.S.: I would like to get in touch with the IronPython developers as well; unfortunately I was not able to find a mailing list or IRC channel - is there anybody who can point me in the right direction? This is the IronPython mailing list: http://lists.ironpython.com/listinfo.cgi/users-ironpython.com All the best, Michael Foord
Re: [Python-Dev] funky buildbot problems again...
On 07/04/2011 21:31, Bill Janssen wrote: My Intel Snow Leopard 2 build slave has gone into outer-space again. [snip...] So it's been spinning its wheels for 3 days. Sure looks like the connection attempt is failing, for some reason. I'm using the stock Twisted that comes with Snow Leopard -- tried to upgrade it but apparently can't. You certainly shouldn't update the Twisted on your system Python. Can't you install Python 2.6 (from python.org) separately and install Twisted into that? Michael [snip...]
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On 04/07/2011 04:28 PM, Jesse Noller wrote: [snip...] Right now, we are talking about building speed.python.org to test the speed of python interpreters, over time, and alongside one another - cython *is not* an interpreter. Cython is out of scope for this. Why is it out of scope to use the benchmarks and test harness to answer questions like can we use Cython to provide optional optimizations for the stdlib? I can certainly see value in having an objective way to compare the macro benchmark performance of a Cython-optimized CPython vs. a vanilla CPython, as well as vs. PyPy, Jython, or IronPython. Tres.
-- === Tres Seaver +1 540-429-0999 tsea...@palladion.com Palladion Software Excellence by Design http://palladion.com
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On Thu, 07 Apr 2011 17:32:24 -0400 Tres Seaver tsea...@palladion.com wrote: Right now, we are talking about building speed.python.org to test the speed of python interpreters, over time, and alongside one another - cython *is not* an interpreter. Cython is out of scope for this. Why is it out of scope to use the benchmarks and test harness to answer questions like can we use Cython to provide optional optimizations for the stdlib? I can certainly see value in having an objective way to compare the macro benchmark performance of a Cython-optimized CPython vs. a vanilla CPython, as well as vs. PyPy, Jython, or IronPython. Agreed. Assuming someone wants to take care of the Cython side of things, I don't think there's any reason to exclude it under the dubious reason that it's not an interpreter. (Would you exclude Psyco, if it were still alive?) Regards Antoine.
Re: [Python-Dev] Policy for making changes to the AST
2011/4/7 Maciej Fijalkowski fij...@gmail.com: [snip...] PyPy implements exactly the same AST. I think Jython also does, although I'm not that sure. There have already been issues - with, say, subclassing AST nodes - where PyPy was incompatible with CPython. That said, it's completely fine from PyPy's perspective to change the AST between major releases. Speaking as the author of PyPy's AST implementation, there are even some changes I'd like that would make it easier! -- Regards, Benjamin
Re: [Python-Dev] funky buildbot problems again...
Michael Foord fuzzy...@voidspace.org.uk wrote: [snip...] You certainly shouldn't update the Twisted on your system Python. Can't you install Python 2.6 (from python.org) separately and install Twisted into that? Apparently not. That's what I tried first -- install Python 2.7, and then the latest Twisted. Bill
Re: [Python-Dev] Policy for making changes to the AST
On Tue, Apr 5, 2011 at 6:37 AM, Nick Coghlan ncogh...@gmail.com wrote: 1. Making docstring an attribute of the Function node rather than leaving it embedded as the first statement in the suite (this avoids issues where AST-based constant folding could potentially corrupt the docstring) 2. Collapsing Num, Str, Bytes, Ellipsis into a single Literal node type (the handling of those nodes is the same in a lot of cases) 3. Since they're keywords now, pick up True, False, None at the parsing stage and turn them into instances of the Literal node type, allowing the current Name-based special casing to be removed. All of these sound like useful changes to me - I wouldn't want them blocked on Jython's account. We'll just implement them when we catch up to this version as far as I'm concerned. -Frank
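For what it's worth, suggestion 2 is essentially what later CPython releases ended up doing: from Python 3.8 onward, numbers, strings, bytes, True/False/None, and Ellipsis all parse to a single ast.Constant node. A quick illustration, runnable on a modern CPython:

```python
import ast

# Every literal below parses to the same node type, ast.Constant,
# rather than the older Num / Str / NameConstant / Ellipsis nodes.
tree = ast.parse("x = 42\ny = 'hi'\nz = None\nw = ...")

literals = [node.value for node in ast.walk(tree)
            if isinstance(node, ast.Constant)]

assert len(literals) == 4
assert 42 in literals and 'hi' in literals
assert None in literals and Ellipsis in literals
```

Tools that walk the tree can then handle all literals in one place instead of special-casing each node type, which is exactly the simplification argued for above.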
Re: [Python-Dev] funky buildbot problems again...
In article 4d9e2054.3080...@voidspace.org.uk, Michael Foord fuzzy...@voidspace.org.uk wrote: [snip...] You certainly shouldn't update the Twisted on your system Python. Can't you install Python 2.6 (from python.org) separately and install Twisted into that? +1 That should have no impact that I can think of on any buildbot testing, as python.org framework builds are entirely self-contained. -- Ned Deily, n...@acm.org
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On 07/04/2011 22:41, Antoine Pitrou wrote: [snip...] Agreed. Assuming someone wants to take care of the Cython side of things, I don't think there's any reason to exclude it under the dubious reason that it's not an interpreter. (Would you exclude Psyco, if it were still alive?) Well, sure - but within the scope of a GSoC project, limiting it to core Python seems like a more realistic goal. Adding Cython later shouldn't be an issue if someone is willing to do the work. All the best, Michael Foord
Re: [Python-Dev] funky buildbot problems again...
On 08:31 pm, jans...@parc.com wrote: My Intel Snow Leopard 2 build slave has gone into outer-space again. When I look at it, I see buildslave taking up most of a CPU (80%), and nothing much else going on. The twistd log says: [... much omitted ...] So it's been spinning its wheels for 3 days. Does this mean that the 2011-04-04 09:03:27-0700 [-] Connecting to dinsdale.python.org:9020 message in the logs is the last one you see until you restart the slave? Or does it mean that the logs go on and on for three days with these Connecting to dinsdale / Connection Refused / ... will retry in N seconds cycles, thousands and thousands of times? What does the buildmaster's info page for this slave say when the slave is in this state? In particular, what does it say about connects/hour? Jean-Paul
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On Thu, Apr 7, 2011 at 6:11 PM, Michael Foord fuzzy...@voidspace.org.ukwrote: On 07/04/2011 22:41, Antoine Pitrou wrote: On Thu, 07 Apr 2011 17:32:24 -0400 Tres Seavertsea...@palladion.com wrote: Right now, we are talking about building speed.python.org to test the speed of python interpreters, over time, and alongside one another - cython *is not* an interpreter. Cython is out of scope for this. Why is it out of scope to use the benchmarks and test harness to answer questions like can we use Cython to provide optional optimizations for the stdlib? I can certainly see value in havng an objective way to compare the macro benchmark performance of a Cython-optimized CPython vs. a vanilla CPython, as well as vs. PyPY, Jython, or IronPython. Agreed. Assuming someone wants to take care of the Cython side of things, I don't think there's any reason to exclude it under the dubious reason that it's not an interpreter. (would you exclude Psyco, if it was still alive?) Well, sure - but within the scope of a GSOC project limiting it to core python seems like a more realistic goal. Adding cython later shouldn't be an issue if someone is willing to do the work. Jesse, I understand that we are talking about the benchmarks on speed.pypy.org. The current suite, and correct me if I am wrong, is completely written in pure python so that any of the 'interpreters' may run them. My point, which I stand by, was that during the initial phase (where benchmarks are defined) that the Cython crowd should have a voice. This should have an enriching effect on the whole benchmarking task since they have thought about this issue in a way that is largely orthogonal to the methods PyPy developed. I think it would be a mistake to leave Cython out of the scoping study. I actually agree with Micheal. I think the onus of getting the benchmarks working on every platform is the onus of that interpreter's community. 
The benchmarking framework that is being developed as part of GSoC should be agile enough to add and drop projects over time and be able to mark certain tests as 'known failures', etc. I don't think I am asking anything unreasonable here. Especially since, at the end of the day, the purview of projects like PyPy and Cython (Make Python Faster) is the same. Be Well Anthony All the best, Michael Foord Regards Antoine. -- http://www.voidspace.org.uk/ May you do good and not evil May you find forgiveness for yourself and forgive others May you share freely, never taking more than you give. -- the sqlite blessing http://www.sqlite.org/different.html
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On 08/04/2011 00:36, Anthony Scopatz wrote: [... quoted thread omitted ...]
Personally I think the GSoC project should just take the pypy suite and run with that - bikeshedding about what benchmarks to include is going to make it hard to make progress. We can have fun with that discussion once we have the infrastructure and *some* good benchmarks in place (and the pypy ones are good ones). So I'm still with Jesse on this one. If there is any discussion phase as part of the GSoC project it should be very strictly bounded by time. All the best, Michael
Re: [Python-Dev] funky buildbot problems again...
exar...@twistedmatrix.com wrote: On 08:31 pm, jans...@parc.com wrote: My Intel Snow Leopard 2 build slave has gone into outer-space again. When I look at it, I see buildslave taking up most of a CPU (80%), and nothing much else going on. The twistd log says:

[... much omitted ...]
2011-04-04 08:35:47-0700 [-] sending app-level keepalive
2011-04-04 08:45:47-0700 [-] sending app-level keepalive
2011-04-04 08:55:47-0700 [-] sending app-level keepalive
2011-04-04 09:03:15-0700 [Broker,client] lost remote
2011-04-04 09:03:15-0700 [Broker,client] lost remote
2011-04-04 09:03:15-0700 [Broker,client] lost remote
2011-04-04 09:03:15-0700 [Broker,client] lost remote
2011-04-04 09:03:15-0700 [Broker,client] lost remote
2011-04-04 09:03:15-0700 [Broker,client] Lost connection to dinsdale.python.org:9020
2011-04-04 09:03:15-0700 [Broker,client] <twisted.internet.tcp.Connector instance at 0x101629ab8> will retry in 3 seconds
2011-04-04 09:03:15-0700 [Broker,client] Stopping factory <buildslave.bot.BotFactory instance at 0x1016299e0>
2011-04-04 09:03:18-0700 [-] Starting factory <buildslave.bot.BotFactory instance at 0x1016299e0>
2011-04-04 09:03:18-0700 [-] Connecting to dinsdale.python.org:9020
2011-04-04 09:03:18-0700 [Uninitialized] Connection to dinsdale.python.org:9020 failed: Connection Refused
2011-04-04 09:03:18-0700 [Uninitialized] <twisted.internet.tcp.Connector instance at 0x101629ab8> will retry in 8 seconds
2011-04-04 09:03:18-0700 [Uninitialized] Stopping factory <buildslave.bot.BotFactory instance at 0x1016299e0>
2011-04-04 09:03:27-0700 [-] Starting factory <buildslave.bot.BotFactory instance at 0x1016299e0>
2011-04-04 09:03:27-0700 [-] Connecting to dinsdale.python.org:9020

So it's been spinning its wheels for 3 days. Does this mean that the 2011-04-04 09:03:27-0700 [-] Connecting to dinsdale.python.org:9020 message in the logs is the last one you see until you restart the slave? Yes, that's the last line in the file.
Or does it mean that the logs go on and on for three days with these Connecting to dinsdale / Connection Refused / ... will retry in N seconds cycles, thousands and thousands of times? Well, it's doing something, chewing up cycles, but there's only one Connecting line at the end of the log file. What does the buildmaster's info page for this slave say when the slave is in this state? In particular, what does it say about connects/hour? Ah, good question. Too bad I restarted the slave after I sent out my info. Is there some way to recover that from earlier? If not, it will undoubtedly fail again in a few days. Bill
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On Thu, Apr 7, 2011 at 6:52 PM, Michael Foord fuzzy...@voidspace.org.uk wrote: *some* good benchmarks in place (and the pypy ones are good ones). Agreed. The PyPy ones are good. So I'm still with Jesse on this one. If there is any discussion phase as part of the GSoC project it should be very strictly bounded by time. I was simply going with what the abstract said. I am fine with discussion needing to be timely (a week?). But it seems that from what you are saying, just to be clear, Point (2) Implementation is also non-existent, as the PyPy benchmarks already exist. If the point of the GSoC is to port the PyPy benchmarks to Python 3, under Point (3) Porting, might I suggest a slight revision of the proposal ;)? Be Well Anthony
Re: [Python-Dev] funky buildbot problems again...
On 12:07 am, jans...@parc.com wrote: [... quoted log omitted ...] Does this mean that the 2011-04-04 09:03:27-0700 [-] Connecting to dinsdale.python.org:9020 message in the logs is the last one you see until you restart the slave? Yes, that's the last line in the file.
Or does it mean that the logs go on and on for three days with these Connecting to dinsdale / Connection Refused / ... will retry in N seconds cycles, thousands and thousands of times? Well, it's doing something, chewing up cycles, but there's only one Connecting line at the end of the log file. That's very interesting. It may be worth doing some gdb or dtrace investigation next time it gets into this state. What does the buildmaster's info page for this slave say when the slave is in this state? In particular, what does it say about connects/hour? Ah, good question. Too bad I restarted the slave after I sent out my info. Is there some way to recover that from earlier? If not, it will undoubtedly fail again in a few days. If the master logs are available, that would provide some information. Otherwise, I think waiting for it to happen again is the thing to do. Since there were no other messages in the log file, I expect the connects/hour value will be low - perhaps 0. Jean-Paul
Re: [Python-Dev] Code highlighting in tracker
Because tracker is ugly. Is this an unbiased opinion? :) Eugene
Re: [Python-Dev] Test cases not garbage collected after run
On Fri, Apr 8, 2011 at 8:12 AM, Michael Foord fuzzy...@voidspace.org.uk wrote: On 07/04/2011 20:18, Robert Collins wrote: On Fri, Apr 8, 2011 at 4:49 AM, Michael Foord fuzzy...@voidspace.org.uk wrote: You mean that the test run keeps the test instances alive for the whole test run, so instance attributes are also kept alive. How would you solve this - by having a TestSuite (calling a suite is how a test run is executed) remove members from itself after each test execution? (Any failure tracebacks etc. stored by the TestResult would also have to not keep the test alive.) My only concern would be backwards compatibility due to the change in behaviour. An alternative is in TestCase.run() / TestCase.__call__(), make a copy and immediately delegate to it; that leaves the original untouched, permitting run-in-a-loop style helpers to still work. Testtools did something to address this problem, but I forget what it was offhand. That doesn't sound like a general solution, as not everything is copyable and I don't think we should make that a requirement of tests. The proposed fix is to make test suite runs destructive, either replacing TestCase instances with None or pop'ing tests after they are run (the latter being what twisted Trial does). run-in-a-loop helpers could still repeatedly iterate over suites, just not call the suite. That's quite expensive - repeating discovery etc. from scratch. If you don't repeat discovery then you're assuming copyability. What I suggested didn't /require/ copying - it delegates it to the test; an uncopyable test would simply not do this. -Rob
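To make the "destructive run" idea above concrete, here is a minimal sketch against the stdlib unittest API. The DestructiveTestSuite name, the Example test case, and the reliance on the private _tests list are illustrative assumptions for this sketch, not Trial's actual implementation:

```python
import unittest

class DestructiveTestSuite(unittest.TestSuite):
    """Illustrative sketch: drop the reference to each test as soon as it
    has run, so the suite no longer keeps TestCase instances (and their
    instance attributes) alive for the whole test run."""

    def run(self, result):
        # Note: _tests is a private attribute of the stdlib TestSuite;
        # a real change would live inside unittest itself.
        while self._tests:
            if result.shouldStop:
                break
            test = self._tests.pop(0)  # discard our reference to the test
            test(result)
        return result

class Example(unittest.TestCase):
    def setUp(self):
        # Would otherwise stay reachable until the whole run finishes.
        self.big = [0] * 10**6

    def test_big(self):
        self.assertEqual(len(self.big), 10**6)

suite = DestructiveTestSuite([Example('test_big')])
result = unittest.TestResult()
suite.run(result)
# The suite is now empty, so the Example instance (and self.big) can be
# garbage collected even though the result object is still alive.
```

As the thread notes, this alone is not enough: anything the TestResult stores about a failure would also have to avoid holding the test object.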
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On Thu, Apr 7, 2011 at 7:52 PM, Michael Foord fuzzy...@voidspace.org.uk wrote: [... quoted thread omitted ...]
Personally I think the GSoC project should just take the pypy suite and run with that - bikeshedding about what benchmarks to include is going to make it hard to make progress. We can have fun with that discussion once we have the infrastructure and *some* good benchmarks in place (and the pypy ones are good ones). So I'm still with Jesse on this one. If there is any discussion phase as part of the GSoC project it should be very strictly bounded by time. What Michael said: My goal is to get speed.pypy.org ported to be able to be used by $N interpreters, for $Y sets of performance numbers. I'm trying to constrain the problem, and the initial deployment, so we don't spend the next year meandering about. It should be sufficient to port the benchmarks from speed.pypy.org, and any deltas from http://hg.python.org/benchmarks/, to Python 3, plus the framework that runs the tests, to start. I don't care if we eventually run Cython, Psyco, Parrot, etc. But the focus at the language summit, and the continued focus of me getting the hardware via the PSF to host this on performance/speed.python.org, is tightly on the PyPy, IronPython, Jython and CPython interpreters. Let's just get our basics done first before we go all crazy with adding stuff :) jesse
[Python-Dev] abstractmethod doesn't work in classes
Hello, I've found that abstractmethod and similar decorators don't work in classes inherited from built-in types other than object. For example:

import abc

class MyBase(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def foo(): pass

>>> MyBase()
Traceback (most recent call last):
  File "<pyshell#8>", line 1, in <module>
    MyBase()
TypeError: Can't instantiate abstract class MyBase with abstract methods foo

So far so good, but:

class MyList(list, MyBase):
    pass

>>> MyList()
[]
>>> MyList.__abstractmethods__
frozenset({'foo'})

This is unexpected, since MyList still doesn't implement foo. Should this be considered a bug? I don't see this in the documentation. The underlying reason is that __abstractmethods__ is checked in object_new, but built-in types typically call tp_alloc directly, thus skipping the check. Eugene
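As a hedged illustration of the diagnosis above (the check lives in object.__new__, and list's allocation path bypasses it), a subclass can repeat the check in a Python-level __new__. The CheckedList class and its error message are made up for this sketch and are not a proposed fix for CPython:

```python
import abc

class MyBase(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def foo(self): ...

class CheckedList(list, MyBase):
    """Workaround sketch: re-run the abstract-method check that
    object.__new__ would normally perform, since list's tp_new does not
    go through it."""
    def __new__(cls, *args, **kwargs):
        if getattr(cls, '__abstractmethods__', None):
            raise TypeError(
                "Can't instantiate abstract class %s with abstract methods %s"
                % (cls.__name__, ', '.join(sorted(cls.__abstractmethods__))))
        # Element handling is left to list.__init__, as usual.
        return super().__new__(cls)

class GoodList(CheckedList):
    def foo(self):
        return 'ok'
```

With this in place, CheckedList() raises TypeError just as MyBase() does, while GoodList, which implements foo, instantiates normally.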
Re: [Python-Dev] [GSoC] Developing a benchmark suite (for Python 3.x)
On Thu, Apr 7, 2011 at 8:29 PM, Jesse Noller jnol...@gmail.com wrote: [... quoted thread omitted ...]
[... quoted text omitted ...] Let's just get our basics done first before we go all crazy with adding stuff :) Ahh gotcha, I think I misunderstood the scope in the short term ;). Be Well Anthony
Re: [Python-Dev] abstractmethod doesn't work in classes
I've found that abstractmethod and similar decorators don't work in classes, inherited from built-in types other than object. http://bugs.python.org/issue5996