Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Vinay Sajip
Georg Brandl  gmx.net> writes:

> 
> For the impatient: the result can be seen at .
> 
> - the toolchain is pure Python, therefore can easily be shipped
> 

Very nice! As well as looking very attractive and professional, the all-Python
toolset should make it easier to build the documentation - I've not been able to
get a trouble-free setup of the docs toolchain on Windows.

Thanks for this,

Vinay Sajip

___
Python-Dev mailing list
Python-Dev@python.org
http://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
http://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] [Python-checkins] buildbot failure in x86 W2k trunk

2007-05-20 Thread Brett Cannon

For removing extension modules from the build process on Windows, do you
just delete the File entry from PCbuild/pythoncore.vcproj?

-Brett

On 5/20/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:


The Buildbot has detected a new failure of x86 W2k trunk.
Full details are available at:
http://www.python.org/dev/buildbot/all/x86%20W2k%20trunk/builds/290

Buildbot URL: http://www.python.org/dev/buildbot/all/

Build Reason:
Build Source Stamp: [branch trunk] HEAD
Blamelist: brett.cannon

BUILD FAILED: failed compile

sincerely,
-The Buildbot

___
Python-checkins mailing list
[EMAIL PROTECTED]
http://mail.python.org/mailman/listinfo/python-checkins



Re: [Python-Dev] [Python-3000] PEP 367: New Super

2007-05-20 Thread Nick Coghlan
Tim Delaney wrote:
> So the question is, should the method store the class, or the name? Looking 
> up by name could pick up a totally unrelated class, but storing the 
> undecorated class could miss something important in the decoration.

Couldn't we provide a mechanism whereby the cell can be adjusted to 
point to the decorated class? (heck, the interpreter has access to both 
classes after execution of the class statement - it could probably 
arrange for this to happen automatically whenever the decorated and 
undecorated classes are different).

Cheers,
Nick.

-- 
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---
 http://www.boredomandlaziness.org


Re: [Python-Dev] [Python-checkins] buildbot failure in x86 W2k trunk

2007-05-20 Thread Martin v. Löwis
Brett Cannon wrote:
> For removing extension modules from the build process on Windows, do you
> just delete the File entry from PCbuild/pythoncore.vcproj?

No, you also need to remove the entry from PC/config.c.

Regards,
Martin


Re: [Python-Dev] [Python-3000] PEP 367: New Super

2007-05-20 Thread Tim Delaney
Nick Coghlan wrote:
> Tim Delaney wrote:
>> So the question is, should the method store the class, or the name?
>> Looking up by name could pick up a totally unrelated class, but
>> storing the undecorated class could miss something important in the
>> decoration. 
> 
> Couldn't we provide a mechanism whereby the cell can be adjusted to
> point to the decorated class? (heck, the interpreter has access to
> both classes after execution of the class statement - it could
> probably arrange for this to happen automatically whenever the
> decorated and undecorated classes are different).

Yep - I thought of that. I think that's probably the right way to go.

Tim Delaney


Re: [Python-Dev] Summary of Tracker Issues

2007-05-20 Thread Josiah Carlson

Talin <[EMAIL PROTECTED]> wrote:
> Josiah Carlson wrote:
> > Captchas like this are easily broken using computational methods, or
> > even the porn site trick that was already mentioned.  Never mind
> > Stephen's stated belief, which you quoted, that even the hard captchas
> > are going to be beaten by computational methods soon.  Please try to
> > pay attention to previous posts.
> 
> I think people are trying too hard here - in other words, they are
> putting more computer-science brainpower into the problem than it
> really merits. While it is true that there is an arms race between
> creators of social software applications and spammers, this arms race
> is only waged at the largest scales - spammers simply won't spend the
> effort to go after individual sites; it's not cost-effective,
> especially when there are much more lucrative targets.

My point was that spending time to come up with a "better" captcha in an
attempt to thwart spammers was ill-advised, in particular because others
had brought up various reasons why captchas weren't the way to go.


 - Josiah



Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Georg Brandl
[warning: bulk answer ahead]

First, thanks for all those nice comments!

[John Gabriele]
 > BTW, would like to see a little blurb of your own on that page about
 > how the docs were converted, rendered, and their new source format.

Sure. I've already written part of the new "Documenting Python" docs, which
cover this a bit. The "About the documentation" section will be rewritten too.

[Lea Wiemann]
 > I'd reeeally like to have a look at the source code (don't worry if it's
 > not clean!).  Can you publish or post it somewhere?  If you'd like to
 > store it in the Docutils sandboxes, just drop me a line and I'll give
 > you SVN access.  By the way, things get a lot easier for me if you place
 > it in the public domain, because that's the license Docutils uses, and
 > it's obviously compatible to every other license.

The toolset is now in the Docutils sandbox at
.

 > I actually have a Google Summer of Code project, "Documenting Python
 > Packages with Docutils", which I'll start working on May 28:
 > .
 > It has a somewhat different scope, so our projects will complement each
 > other nicely I believe.  (To the point where we may end up with a
 > complete tool-chain to actually migrate the Python documentation to
 > reST.  Très cool.)

Great! Making the new toolset usable for third-party developers is certainly
a good option. I saw quite a few projects using the LaTeX-based tools too.

[Ron Adam]
 > I've been working on improving pydoc. (slowly but steadily) Maybe we can
 > join efforts as I think the two projects overlap in a number of areas, and
 > it sounds like we are thinking of some of the same things as far as the
 > tool chain.  So maybe there's some synergy we can take advantage of.

Certainly there's plenty of overlap.

 > It looks like there may be a few areas where we can share code.
 >
 > - The html syntax highlighters.   (Pydoc can use those)

The highlighting is actually done with Pygments, which cannot be included
in the stdlib as-is. Perhaps a stripped-down version?

 > - A shared html style sheet.
 > - Document locator.  [1]
 > - An HTMLServer for local (dynamic dispatching) html viewing. [2]
 > - The reST source for viewing topics and keywords in pydoc.
 >   (Instead of scraping html pages. Ick)

Yes, that makes sense. If you want to coordinate efforts, feel free to
contact me at Jabber <[EMAIL PROTECTED]>.

[Ka-Ping Yee]
 > I agree that interactivity (online commenting and editing) will
 > be a huge advantage.

 > I could imagine this heading in a Wiki-like direction -- where a
 > particular version is tagged as the official revision shipped
 > with a particular Python release, but everyone can make edits
 > online to yield new versions, eventually yielding the revision
 > that will be released with the next Python release.

Yes. I think only the latest version should ever be "publicly"
interactive; old archived doc versions should be static.

[Martin Blais]
 > I haven't looked at it in depth yet, but I have a question.  One
 > concern from a long thread on Doc-Sig a long time ago, is that ReST
 > did not at the time possess the ability to nicely markup the objects
 > as LaTeX macros do.   Is your transformation losing markup information
 > from the original docs?  e.g. are you still marking classes as classes
 > and functions as functions in the ReST source, or is it converting
 > from qualified markup to "style" markup (e.g., to generic literals
 > instead of class/function/variable/keyword argument docutils roles,
 > etc.).If you solved that problem, how did you solve it?  Is the
 > resulting ReST pretty?  Do you think we can build a better index?

As Steven said, it's solved quite nicely with interpreted text roles.
Whether ":class:`Foo`" is nicer than "\class{Foo}" is not entirely clear ;)
but you actually get more now, since if a class "Foo" is found in the
namespace, it will be cross-linked automatically.
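
For illustration, a hypothetical fragment of the converted reST source
might look like this (the role and object names here are invented; the
roles are the toolset's interpreted-text roles, not standard Docutils ones):

```rst
A :class:`Pickler` object writes a pickled representation to the given
file; see also the :func:`dump` convenience function.
(Old LaTeX form: ``\class{Pickler}``.)
```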

About the index: I didn't do anything about it. I just transferred the
LaTeX commands into reST directives; the rest is generated completely
analogously.

[Vinay Sajip]
 > Very nice! As well as looking very attractive and professional, the
 > all-Python
 > toolset should make it easier to build the documentation - I've not been
 > able to get a trouble-free setup of the docs toolchain on Windows.

Yep. As it is now, you need three packages from the Cheese Shop:
Docutils, Pygments (the highlighter) and Jinja (the templating engine).
This shouldn't be problematic, though they could also be stripped down
and included.

Cheers,
Georg


-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.


Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Georg Brandl
Vinay Sajip wrote:
> Georg Brandl  gmx.net> writes:
> 
>> 
>> For the impatient: the result can be seen at .
>> 
>> - the toolchain is pure Python, therefore can easily be shipped
>> 
> 
> Very nice! As well as looking very attractive and professional, [...]

BTW, I have to give lots of credit for the looks to Armin Ronacher. I'm
not so much of a designer ;)

Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



Re: [Python-Dev] [Doc-SIG] The docs, reloaded

2007-05-20 Thread Lea Wiemann
[Georg Brandl]
> The highlighting is actually done with Pygments, which cannot be
> included in the stdlib as-is. Perhaps a stripped-down version?

No need to; we can just fall back to no syntax highlighting if Pygments
is not installed on the user's system.
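
That fallback is essentially an optional-import guard. A minimal sketch,
assuming Python 3's stdlib `html` module and an invented helper name; the
Pygments calls (`highlight`, `get_lexer_by_name`, `HtmlFormatter`) are its
real public API:

```python
import html

def highlight_source(code, lexer_name="python"):
    """Render source as HTML, degrading to an escaped <pre> block
    when Pygments is not installed."""
    try:
        from pygments import highlight
        from pygments.formatters import HtmlFormatter
        from pygments.lexers import get_lexer_by_name
    except ImportError:
        # No Pygments on this system: no colours, but still valid HTML.
        return "<pre>%s</pre>" % html.escape(code)
    return highlight(code, get_lexer_by_name(lexer_name), HtmlFormatter())

print(highlight_source("x = 'a < b'"))
```

Either branch returns markup with the source escaped, so callers never
need to know whether the highlighter was available.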

[Gael Varoquaux]
>> - The html syntax highlighters.   (Pydoc can use those)
>
> I have a patch on the docutils patch tracker that does this.

For everyone's reference,
.

Best wishes,

Lea


Re: [Python-Dev] Summary of Tracker Issues

2007-05-20 Thread Scott Dial
Terry Reedy wrote:
> Why not simply embargo any post with an off-site link?  Though there might
> have been some, I can't remember a single example of such at SF.

I have often posted links off-site because the SF tracker didn't allow 
unrelated parties to attach things. I don't know whether the new tracker 
will allow that, but if it doesn't, you will see off-site links again.

-Scott

-- 
Scott Dial
[EMAIL PROTECTED]
[EMAIL PROTECTED]


Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Martin Blais
On 5/20/07, Georg Brandl <[EMAIL PROTECTED]> wrote:
>  > Very nice! As well as looking very attractive and professional, the 
> all-Python
>  > toolset should make it easier to build the documentation - I've not been
>  > able to get a trouble-free setup of the docs toolchain on Windows.
>
> Yep. As it is now, you need three packages from the Cheese Shop:
> Docutils, Pygments (the highlighter) and Jinja (the templating engine).
> This shouldn't be problematic, though they could also be stripped down
> and included.

This is great.  IMHO, if this is to compete to become the official
Python docs, I would argue for even fewer dependencies, even at the
cost of more generic/bland output, for portability reasons and to
stimulate greater adoption.  If we can make some of those dependencies
optional and rely only on Docutils, that could make it ubiquitous.

Another thing to keep in mind:  I don't know if the directives you
defined are very generic, but if they are, it would be interesting to
consider migrating them up into docutils (if it makes sense), and see
if they could support documenting other programming languages.  Could
this be a language-independent documenting toolkit?  Could we document
LISP or Ruby code with it?

Georg, thanks again!


Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Neal Becker
Sounds very interesting.  I just have one concern/question.  I hope that
while moving away from LaTeX, we are not precluding the ability to write
math as part of the documentation.  What would be my choices for adding
math to the documentation?  Hopefully LaTeX, since there really isn't
AFAIK any other competitor for this.



Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Scott Dial
Neal Becker wrote:
> Sounds very interesting.  I just have one concern/question.  I hope that
> while moving away from LaTeX, we are not precluding the ability to write
> math as part of the documentation.  What would be my choices for adding
> math to the documentation?  Hopefully LaTeX, since there really isn't
> AFAIK any other competitor for this.
> 

Where in the current documentation is there any math notation /at all/? 
In all my reading of it, I have not run across anything that appeared 
like it was being used. Besides that question, is the full power of
LaTeX math notation really necessary here? I somehow doubt that anything more
than simple expressions of runtime performance and container behaviors
is appropriate for any documentation we have.

-Scott

-- 
Scott Dial
[EMAIL PROTECTED]
[EMAIL PROTECTED]


[Python-Dev] Introduction and request for commit access to the sandbox.

2007-05-20 Thread Alexandre Vassalotti

Hello,

As some of you may already know, I will be working on Python for this
year's Google Summer of Code. My project is to merge the modules that have
dual C and Python implementations, i.e. cPickle/pickle,
cStringIO/StringIO and cProfile/profile [1]. This project is part of
the standard library reorganization for Python 3000 [2], and my mentor
for this project is Brett Cannon.

So first, let me introduce myself. I am currently a student from
Quebec, Canada. I plan to make a career as a (hopefully good)
programmer, so I dedicate a lot of my free time to contributing
to open source projects, like Ubuntu. I recently became interested
in how compilers and interpreters work, and started reading Python's
source code, which is one of the best organized and most comprehensible
code bases I have seen. This motivated me to start contributing to
Python. However, since school kept me fairly busy, I haven't had the
chance to do anything other than provide support to Python's users
in the #python Freenode IRC channel. This year's Summer of Code will
give me the chance to make a significant contribution to Python, and to
get started with Python development as well.

With that said, I would like to request svn access to the sandbox for my
work. I will use this access only for modifying stuff in the directory
I will be assigned to. I would like to use the username "avassalotti"
and the attached SSH2 public key for this access.

One last thing: if you know of semantic differences (other than the
obvious ones) between the C and Python versions of the modules I need
to merge, please let me know. This will greatly simplify the merge and
reduce the chances of breakage later.
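
The merge strategy being discussed is commonly expressed as an optional
accelerator import: define the pure-Python version first, then let a C
module override it if one is present. A minimal sketch; the
`_demo_speedups` module and this toy `StringIO` are invented for
illustration, not the real modules:

```python
class StringIO:
    """Tiny illustrative pure-Python stand-in, not the real StringIO."""
    def __init__(self, initial=""):
        self._parts = [initial]

    def write(self, s):
        self._parts.append(s)

    def getvalue(self):
        return "".join(self._parts)

try:
    # Hypothetical C accelerator module; if it can be imported, its
    # names silently replace the pure-Python definitions above.
    from _demo_speedups import StringIO
except ImportError:
    pass  # no accelerator compiled; keep the Python version

buf = StringIO()
buf.write("merged ")
buf.write("modules")
print(buf.getvalue())
```

Callers import one module and get the fastest implementation available,
which keeps the semantics of the two versions from drifting apart.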

Cheers,
-- Alexandre

.. [1] Abstract of my application, Merge the C and Python
implementations of the same interface
  (http://code.google.com/soc/psf/appinfo.html?csaid=C6768E09BEF7CCE2)
.. [2] PEP 3108, Standard Library Reorganization, Cannon
  (http://www.python.org/dev/peps/pep-3108)




Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Georg Brandl
Scott Dial wrote:
> Neal Becker wrote:
>> Sounds very interesting.  I just have one concern/question.  I hope that
>> while moving away from LaTeX, we are not precluding the ability to write
>> math as part of the documentation.  What would be my choices for adding
>> math to the documentation?  Hopefully LaTeX, since there really isn't
>> AFAIK any other competitor for this.
>> 
> 
> Where in the current documentation is there any math notation /at all/? 
> In all my reading of it, I have not run across anything that appeared 
> like it was being used. Besides that question, is the full power of 
> LaTeX math notation really necessary here? I somehow doubt anything more 
> than simple expressions of runtime performance and container behaviors 
> are appropriate for any documentation we have.

There is exactly one instance of LaTeX math in the whole docs; it's in the
description of audioop, AFAIR, and contains a sum over square roots...
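
For reference, the formula in question is presumably the root-mean-square
computed by audioop.rms(), which in LaTeX reads roughly:

```latex
\mathrm{rms}(S) = \sqrt{\frac{\sum_{i=1}^{n} S_i^{2}}{n}}
```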

So, that's not really a concern of mine ;)

Georg

-- 
Thus spake the Lord: Thou shalt indent with four spaces. No more, no less.
Four shall be the number of spaces thou shalt indent, and the number of thy
indenting shall be four. Eight shalt thou not indent, nor either indent thou
two, excepting that thou then proceed to four. Tabs are right out.



Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Terry Reedy
Please add a link to the PEP index (which is also missing from
docs.python.org, though not from python.org/doc/).

And consider at least some PEPs as part of the corpus indexed (ie, those 
with info not in the regular docs).

tjr





Re: [Python-Dev] PEP 0365: Adding the pkg_resources module

2007-05-20 Thread Talin
Phillip J. Eby wrote:
> I wanted to get this in before the Py3K PEP deadline, since this is a 
> Python 2.6 PEP that would presumably impact 3.x as well.  Feedback welcome.
> 
> 
> PEP: 365
> Title: Adding the pkg_resources module
> Version: $Revision: 55032 $
> Last-Modified: $Date: 2007-04-30 20:24:48 -0400 (Mon, 30 Apr 2007) $
> Author: Phillip J. Eby <[EMAIL PROTECTED]>
> Status: Draft
> Type: Standards Track
> Content-Type: text/x-rst
> Created: 30-Apr-2007
> Post-History: 30-Apr-2007
> 
> 
> Abstract
> 
> 
> This PEP proposes adding an enhanced version of the ``pkg_resources``
> module to the standard library.
> 
> ``pkg_resources`` is a module used to find and manage Python
> package/version dependencies and access bundled files and resources,
> including those inside of zipped ``.egg`` files.  Currently,
> ``pkg_resources`` is only available through installing the entire
> ``setuptools`` distribution, but it does not depend on any other part
> of setuptools; in effect, it comprises the entire runtime support
> library for Python Eggs, and is independently useful.
> 
> In addition, with one feature addition, this module could support
> easy bootstrap installation of several Python package management
> tools, including ``setuptools``, ``workingenv``, and ``zc.buildout``.
> 
> 
> Proposal
> 
> 
> Rather than proposing to include ``setuptools`` in the standard
> library, this PEP proposes only that ``pkg_resources`` be added to the
> standard library for Python 2.6 and 3.0.  ``pkg_resources`` is
> considerably more stable than the rest of setuptools, with virtually
> no new features being added in the last 12 months.
> 
> However, this PEP also proposes that a new feature be added to
> ``pkg_resources``, before being added to the stdlib.  Specifically, it
> should be possible to do something like::
> 
>  python -m pkg_resources SomePackage==1.2
> 
> to request downloading and installation of ``SomePackage`` from PyPI.
> This feature would *not* be a replacement for ``easy_install``;
> instead, it would rely on ``SomePackage`` having pure-Python ``.egg``
> files listed for download via the PyPI XML-RPC API, and the eggs would
> be placed in the ``$PYTHONEGGS`` cache, where they would **not** be
> importable by default.  (And no scripts would be installed)  However,
> if the download egg contains installation bootstrap code, it will be
> given a chance to run.
> 
> These restrictions would allow the code to be extremely simple, yet
> still powerful enough to support users downloading package management
> tools such as ``setuptools``, ``workingenv`` and ``zc.buildout``,
> simply by supplying the tool's name on the command line.
> 
> 
> Rationale
> =
> 
> Many users have requested that ``setuptools`` be included in the
> standard library, to save users needing to go through the awkward
> process of bootstrapping it.  However, most of the bootstrapping
> complexity comes from the fact that setuptools-installed code cannot
> use the ``pkg_resources`` runtime module unless setuptools is already
> installed. Thus, installing setuptools requires (in a sense) that
> setuptools already be installed.
> 
> Other Python package management tools, such as ``workingenv`` and
> ``zc.buildout``, have similar bootstrapping issues, since they both
> make use of setuptools, but also want to provide users with something
> approaching a "one-step install".  The complexity of creating bootstrap
> utilities for these and any other such tools that arise in future, is
> greatly reduced if ``pkg_resources`` is already present, and is also
> able to download pre-packaged eggs from PyPI.
> 
> (It would also mean that setuptools would not need to be installed
> in order to simply *use* eggs, as opposed to building them.)
> 
> Finally, in addition to providing access to eggs built via setuptools
> or other packaging tools, it should be noted that since Python 2.5,
> the distutils install package metadata (aka ``PKG-INFO``) files that
> can be read by ``pkg_resources`` to identify what distributions are
> already on ``sys.path``.  In environments where Python packages are
> installed using system package tools (like RPM), the ``pkg_resources``
> module provides an API for detecting what versions of what packages
> are installed, even if those packages were installed via the distutils
> instead of setuptools.
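
For what it's worth, the detection API described above can be sketched
with pkg_resources' documented `working_set` and `parse_version` helpers;
this assumes setuptools (which ships `pkg_resources`) is importable:

```python
import pkg_resources

# Enumerate distributions visible on sys.path, however they were
# installed (eggs, plain distutils + PKG-INFO, system packages, ...).
installed = {dist.project_name: dist.version
             for dist in pkg_resources.working_set}
print(sorted(installed)[:5])

# Version comparisons use the same machinery as requirement
# strings such as "SomePackage==1.2".
v = pkg_resources.parse_version
assert v("1.2") < v("1.10") < v("2.0")
```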
> 
> 
> Implementation and Documentation
> 
> 
> The ``pkg_resources`` implementation is maintained in the Python
> SVN repository under ``/sandbox/trunk/setuptools/``; see
> ``pkg_resources.py`` and ``pkg_resources.txt``.  Documentation for the
> egg format(s) supported by ``pkg_resources`` can be found in
> ``doc/formats.txt``.  HTML versions of these documents are available
> at:
> 
> * http://peak.telecommunity.com/DevCenter/PkgResources and
> 
> * http://peak.telecommunity.com/DevCenter/EggFormats
> 
> (These HTML versions are for setuptools 0.6; they may not reflect 

Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Neal Becker
Georg Brandl wrote:

> Scott Dial wrote:
>> Neal Becker wrote:
>>> Sounds very interesting.  I just have one concern/question.  I hope that
>>> while moving away from LaTeX, we are not precluding the ability to write
>>> math as part of the documentation.  What would be my choices for adding
>>> math to the documentation?  Hopefully LaTeX, since there really isn't
>>> AFAIK any other competitor for this.
>>> 
>> 
>> Where in the current documentation is there any math notation /at all/?
>> In all my reading of it, I have not run across anything that appeared
>> like it was being used. Besides that question, is the full power of
>> LaTeX math notation really necessary here? I somehow doubt anything more
>> than simple expressions of runtime performance and container behaviors
>> are appropriate for any documentation we have.
> 
> There is exactly one instance of LaTeX math in the whole docs; it's in the
> description of audioop, AFAIR, and contains a sum over square roots...
> 
> So, that's not really a concern of mine ;)
> 
> Georg
> 

There is an effort as part of numpy to come up with a new system using
docstrings.  It seems to me it would be unfortunate if these two efforts
were not coordinated.



Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Robert Kern
Neal Becker wrote:

> There is an effort as part of numpy to come up with a new system using
> docstrings.  It seems to me it would be unfortunate if these two efforts
> were not coordinated.

I don't think so. The issue with numpy is getting our act together and making
parseable docstrings for auto-generated API documentation using existing tools
or slightly modified versions thereof. No one is actually contemplating building
a new tool. AFAICT, Georg's (excellent) work doesn't address that use. I don't
think there is anything to coordinate, here. Provided that Georg's system
doesn't place too many restrictions on the reST it handles, we could use the
available reST math options if we wanted to use Georg's system.

I'd much rather see Georg spend his time working on the docs for the Python
language and the feature set it requires. If the numpy project needs to extend
that feature set, we'll provide the manpower ourselves.

-- 
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma
 that is made terrible by our own mad attempt to interpret it as though it had
 an underlying truth."
  -- Umberto Eco



Re: [Python-Dev] [Python-3000] PEP 367: New Super

2007-05-20 Thread Phillip J. Eby
At 04:25 PM 5/20/2007 +1000, Tim Delaney wrote:
>I'm not sure what you're getting at here - are you referring to the 
>decorators for classes PEP? In that case, the decorator is applied 
>after the class is constructed, so it would be the undecorated class.
>
>Are class decorators going to update the MRO? I see nothing about 
>that in PEP 3129, so using the undecorated class would match the 
>current super(cls, self) behaviour.

Class decorators can (and sometimes *do*, in PEAK) return an object 
that's not the original class object.  So that would break super, 
which is why my inclination is to go with using the decorated result.
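For illustration, a sketch of the failure mode under discussion, written against the zero-argument super() that this proposal eventually became; the `replace` decorator here is a hypothetical stand-in for any decorator that returns a brand-new class object:

```python
def replace(cls):
    # A decorator that returns a brand-new class object, as some
    # decorators (e.g. in PEAK) do.
    return type(cls.__name__, cls.__bases__, dict(cls.__dict__))

class Base:
    def greet(self):
        return 'base'

@replace
class C(Base):
    def greet(self):
        # The implicit class reference captured at definition time still
        # points at the *undecorated* class, so self is no longer an
        # instance of it and super() fails.
        return 'C/' + super().greet()

try:
    C().greet()
    broke = False
except TypeError:
    broke = True

print('super() broke:', broke)
```

Storing the decorated result (or fixing up the cell afterwards, as Nick suggests below) avoids this mismatch.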



Re: [Python-Dev] [Python-3000] PEP 367: New Super

2007-05-20 Thread Phillip J. Eby
At 06:20 PM 5/20/2007 +1000, Tim Delaney wrote:
>Nick Coghlan wrote:
> > Tim Delaney wrote:
> >> So the question is, should the method store the class, or the name?
> >> Looking up by name could pick up a totally unrelated class, but
> >> storing the undecorated class could miss something important in the
> >> decoration.
> >
> > Couldn't we provide a mechanism whereby the cell can be adjusted to
> > point to the decorated class? (heck, the interpreter has access to
> > both classes after execution of the class statement - it could
> > probably arrange for this to happen automatically whenever the
> > decorated and undecorated classes are different).
>
>Yep - I thought of that. I think that's probably the right way to go.

Btw, PEP 3124 needs a way to receive the same class object at more or 
less the same moment, although in the form of a callback rather than 
a cell assignment.  Guido suggested I co-ordinate with you to design 
a mechanism for this.




Re: [Python-Dev] Py2.6 buildouts to the set API

2007-05-20 Thread skip

>> * New method (proposed by Shane Holloway):  s1.isdisjoint(s2).   

Mike> +1.  Disjointness verification is one of my main uses for set(),
Mike> and though I don't think that the early-out condition would
Mike> trigger often in my code, it would increase readability.

I think the readability argument is marginal at best.  I use sets frequently
and to the greatest extent possible use the builtin operator support, because
I find that more readable.  So for me, I'd be going from

if not s1 & s2:

to

if s1.isdisjoint(s2):

I'm not sure that's an improvement.

Maybe it's just me, but given two sets I frequently want to operate on
s1-s2, s2-s1 and s1&s2 in different ways.  I wouldn't find a disjoint
operation all that useful.
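For reference, a rough pure-Python sketch of the proposed method's semantics next to the operator spelling (an approximation, not the actual implementation):

```python
def isdisjoint(s1, s2):
    # Equivalent to `not (s1 & s2)`, but can bail out on the first
    # common element instead of building the intersection set.
    small, large = (s1, s2) if len(s1) <= len(s2) else (s2, s1)
    return not any(x in large for x in small)

print(isdisjoint({1, 2}, {3, 4}))   # True
print(isdisjoint({1, 2}, {2, 3}))   # False
```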

Skip


Re: [Python-Dev] Summary of Tracker Issues

2007-05-20 Thread skip

talin> While it is true that there is an arms race between creators of
talin> social software applications and spammers, this arms race is only
talin> waged at the largest scales - spammers simply won't spend the effort
talin> to go after individual sites; it's not cost-effective, especially
talin> when there are much more lucrative targets.

The advantage of choosing a couple of simple topical questions is that, in
theory, every Roundup installation can create a site-specific set of
questions.  If each site builds a small database of 10 or so questions, then
chooses two or three at random for each submission, it seems that would make
Roundup a very challenging system to hack in this regard.  It would also
likely be tough to use the porn-site human-proxy idea, since the
questions will (or ought to) be topical (what is the power operator?, what
does the "E" in R. E. Olds stand for?), not general (what star shines during
the day? what day precedes Monday?)
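The scheme described above could be as simple as this sketch (the question pool and function names are hypothetical):

```python
import random

# A hypothetical site-specific pool; a real tracker would keep ~10 of these.
QUESTIONS = {
    "What is the Python power operator?": "**",
    "What keyword starts a function definition?": "def",
    "What does len('abc') return?": "3",
}

def pick_challenge(pool, k=2):
    # Choose k distinct topical questions at random for each submission.
    return random.sample(sorted(pool), k)

for q in pick_challenge(QUESTIONS):
    print(q)
```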

Skip


Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread skip

>>> What would be my choices for adding math to the documentation?

>> Where in the current documentation is there any math notation /at
>> all/?

Georg> There is exactly one instance of LaTeX math in the whole docs,
Georg> it's in the description of audioop, AFAIR, and contains a sum over
Georg> square roots...

Georg> So, that's not really a concern of mine ;)

You must realize that people will use the core tools to create documentation
for third party packages which aren't in the core.  If you replace LaTeX
with something else I think you need to keep math in mind whether it's used
in the core documentation or not.

Skip


[Python-Dev] Strange behaviour with PyEval_EvalCode

2007-05-20 Thread Joe Eagar
[I'm re-sending this message cause it might've gotten lost; sorry if it
ends up posting twice]
Hi I'm getting extremely odd behavior.  First of all, why isn't
PyEval_EvalCode documented  anywhere?  Anyway, I'm working on blender's
python integration (it embeds python, as opposed to python embedding
it).  I have a function that executes a string buffer of python code,
fetches a function from its global dictionary then calls it.

When the function code returns a local variable, PyObject_Call() appears
to be returning garbage.  The initial implementation used the same
dictionary for the global and local dicts.  I tried using separate
dicts, but then the function wasn't being called at all (or at least I
tested it by putting a "print "bleh"" in there, and it didn't work).

I've tested with both python 2.4 and 2.5.   Mostly with 2.4.  This bug
may be cropping up in other experimental blender python code  as well.

Here's the code in the string buffer:
#BPYCONSTRAINT
from Blender import *
from Blender.Mathutils import *
print "d"
def doConstraint(inmat, tarmat, prop):
a = Matrix()
a.identity()
a = a * TranslationMatrix(Vector(0, 0, 0))
print "t"
a = tarmat
return inmat

print doConstraint(Matrix(), Matrix(), 0)

Here's the code that executes the string buffer:

PyObject *RunPython2( Text * text, PyObject * globaldict, PyObject
*localdict )
{
char *buf = NULL;

/* The script text is compiled to Python bytecode and saved at
text->compiled
* to speed-up execution if the user executes the script multiple times */

if( !text->compiled ) {// if it wasn't already compiled, do it now
buf = txt_to_buf( text );

text->compiled =
Py_CompileString( buf, GetName( text ),
  Py_file_input );

MEM_freeN( buf );

if( PyErr_Occurred(  ) ) {
BPY_free_compiled_text( text );
return NULL;
}

}
return PyEval_EvalCode( text->compiled, globaldict, localdict );
}


. . .and heres the (rather long, and somewhat in a working state)
function that calls the function in the script's global dictionary:

void BPY_pyconstraint_eval(bPythonConstraint *con, float obmat[][4],
short ownertype, void *ownerdata, float targetmat[][4])
{
PyObject *srcmat, *tarmat, *idprop;
PyObject *globals, *locals;
PyObject *gkey, *gval;
PyObject *retval;
MatrixObject *retmat;
Py_ssize_t ppos = 0;
int row, col;

if ( !con->text ) return;

globals = CreateGlobalDictionary();

srcmat = newMatrixObject( (float*)obmat, 4, 4, Py_NEW );
tarmat = newMatrixObject( (float*)targetmat, 4, 4, Py_NEW );
idprop = BPy_Wrap_IDProperty( NULL, &con->prop, NULL);

/*  since I can't remember what the armature weakrefs do, I'll just
leave this here
commented out.  Since this function was based on pydrivers.
if( !setup_armature_weakrefs()){
fprintf( stderr, "Oops - weakref dict setup\n");
return result;
}
*/
retval = RunPython2( con->text, globals, globals);

if (retval) {Py_XDECREF( retval );}

if ( retval == NULL ) {
BPY_Err_Handle(con->text->id.name);
ReleaseGlobalDictionary( globals );

/*free temp objects*/
Py_XDECREF( idprop );
Py_XDECREF( srcmat );
Py_XDECREF( tarmat );
return;
}

/*Now for the fun part! Try and find the functions we need.*/
while ( PyDict_Next(globals, &ppos, &gkey, &gval) ) {
if ( PyString_Check(gkey) && strcmp(PyString_AsString(gkey),
"doConstraint")==0 ) {
if (PyFunction_Check(gval) ) {
retval = PyObject_CallObject(gval, Py_BuildValue("OOO",
srcmat, tarmat, idprop));
Py_XDECREF( retval );
} else {
printf("ERROR: doConstraint is supposed to be a
function!\n");
}
break;
}
}

if (!retval) {
BPY_Err_Handle(con->text->id.name);
/*free temp objects*/
ReleaseGlobalDictionary( globals );

Py_XDECREF( idprop );
Py_XDECREF( srcmat );
Py_XDECREF( tarmat );
return;
}

if (!PyObject_TypeCheck(retval, &matrix_Type)) {
printf("Error in pyconstraint: Wrong return type for a
pyconstraint!\n");
ReleaseGlobalDictionary( globals );

Py_XDECREF( idprop );
Py_XDECREF( srcmat );
Py_XDECREF( tarmat );
Py_XDECREF( retval );
return;
}

retmat = (MatrixObject*) retval;
if (retmat->rowSize != 4 || retmat->colSize != 4) {
printf("Error in pyconstraint: Matrix is the wrong size!\n");
ReleaseGlobalDictionary( globals );

Py_XDECREF( idprop );
Py_XDECREF( srcmat );
Py_XDECREF( tarmat );
Py_XDECREF( retval );
return;
}

//this is the reverse of code taken from newMatrix().
for(row = 0; row < 4; row++) {
for(col = 0; col < 4; col++) {
if (retmat->wrapped) obmat[row][col] 
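For what it's worth, a pure-Python rendering of what the C code above attempts may make the globals/locals question concrete.  In particular, when a code object is executed with *separate* global and local dicts, the module-level `def` binds into the locals dict, so scanning the globals dict for `doConstraint` finds nothing (the script body here is a hypothetical stand-in):

```python
src = '''
def doConstraint(inmat, tarmat, prop):
    return inmat
'''
code = compile(src, '<pyconstraint>', 'exec')

# Same dict for globals and locals, as the working C path does:
ns = {}
exec(code, ns, ns)
assert 'doConstraint' in ns
print(ns['doConstraint']([1, 0], [0, 1], 0))   # [1, 0]

# Separate dicts: the function ends up in the *locals* dict only.
glb, loc = {}, {}
exec(code, glb, loc)
assert 'doConstraint' not in glb and 'doConstraint' in loc
```

Note also that in the C code, `Py_XDECREF(retval)` immediately after the call releases the only reference before `retval` is inspected, which would explain seeing garbage.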

Re: [Python-Dev] [Doc-SIG] The docs, reloaded

2007-05-20 Thread John Gabriele
On 5/19/07, Georg Brandl <[EMAIL PROTECTED]> wrote:
>
> [snip]
>
> Waiting for comments!

Awesome, Georg! Wow. Nice work.

Seems like this has been a long time comin', and I bet others have
been working away "in secret" on similar projects. I hope you keep
running with it until it gets hijacked into being the "official"
version. :)

I'm bookmarking it as "python docs" in my browser.

BTW, would like to see a little blurb of your own on that page about
how the docs were converted, rendered, and their new source format.

Thanks much,
---John

P.S. -- funny sig, btw. :)


Re: [Python-Dev] [Doc-SIG] The docs, reloaded

2007-05-20 Thread Lea Wiemann
Martin Blais wrote:

> e.g. are you still marking classes as classes
> and functions as functions in the ReST source

It seems so (modulo XXX's and TODO's in Georg's implementation, probably
^_^) -- all of the pages have "show source" links, so you can see for
yourself.  I'm not an expert with the documentation system, but the
markup on  looks pretty
complete to me.

> (Somewhat related, but another idea from back then, which was never
> implemented IMO, was to find a way to automatically pull and convert
> the docstrings from the source code into the documentation, thus
> unifying all the information in one place.)

While it's probably not possible to simply generate the documentation
from the docstrings, it would certainly seem interesting to have
some means (like a directive) to pull docstrings into the documentation.
 I think, however, that while migrating the docs to reStructuredText is
comparatively straightforward [1]_, pulling documentation from the
docstrings will require quite a bit of design and discussion work.  So
I'd suggest we postpone this idea until we have a working documentation
system in reStructuredText, so we don't clutter the discussion.

.. [1] I'm sure there will still be quite a few issues to sort out that
   I'm simply not seeing right now.
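As a rough illustration, the core of such a directive would only need something like this sketch (the `docstring_of` helper is hypothetical):

```python
import inspect

def docstring_of(dotted_name):
    # Resolve 'module.attr' and return the cleaned-up docstring,
    # ready to be parsed as reST by a hypothetical directive.
    module_name, _, attr = dotted_name.rpartition('.')
    obj = getattr(__import__(module_name, fromlist=[attr]), attr)
    return inspect.cleandoc(obj.__doc__ or '')

print(docstring_of('textwrap.dedent').splitlines()[0])
```

The hard design work is everything around this: cross-references, markup conventions inside docstrings, and merging with hand-written prose.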

Best wishes,

Lea


Re: [Python-Dev] [Doc-SIG] The docs, reloaded

2007-05-20 Thread Gael Varoquaux
On Sat, May 19, 2007 at 03:31:59PM -0500, Ron Adam wrote:
> - The html syntax highlighters.   (Pydoc can use those)

I have a patch on the docutils patch tracker that does this. Code is
probably of a rather bad quality, but it outputs LaTeX and HTML. If we
can work together to improve this patch and get it in docutils it will
avoid having different syntaxes and behavior depending on the front-end
to docutils being used (I am thinking of rest2web, trac, and I am
probably forgetting some others).

The patch has been sitting there for almost 6 months without review, but
I hope that if people other than me work on it and ask for review, it will
both improve, get reviewed, and eventually get in!

Sorry for the shameless plug, but I really do think we need a unifying
approach to this.

Gaël


Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Martin Blais
On 5/20/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
>
> Georg> There is exactly one instance of LaTeX math in the whole docs,
> Georg> it's in the description of audioop, AFAIR, and contains a sum over
> Georg> square roots...
>
> Georg> So, that's not really a concern of mine ;)
>
> You must realize that people will use the core tools to create documentation
> for third party packages which aren't in the core.  If you replace LaTeX
> with something else I think you need to keep math in mind whether it's used
> in the core documentation or not.

IMHO the question of math support in ReST is one best answered at the
level of docutils, rather than by Georg.  A number of discussions on
that topic have already taken place.


[Python-Dev] Adventures with x64, VS7 and VS8 on Windows

2007-05-20 Thread Mark Hammond
Hi all,
  I hope the cross-post is appropriate.

  I've started playing with getting the pywin32 extensions building under
the AMD64 architecture.  I started building with Visual Studio 8 (it was
what I had handy) and I struck a few issues relating to the compiler version
that I thought worth sharing.

* In trying to build x64 from a 32bit VS7 (ie, cross-compiling via the
PCBuild directory), the python.exe project fails with:

pythoncore fatal error LNK1112: module machine type 'X86' conflicts with
target machine type 'AMD64'

  is this a known issue, or am I doing something wrong?

* The PCBuild8 project files appear to work without modification (I only
tried native compilation here though, not a cross-compile) - however, unlike
the PCBuild directory, they place all binaries in a 'PCBuild8/x64'
directory.  While this means that its possible to build for multiple
architectures from the same source tree, it makes life harder for tools like
'distutils' - eg, distutils already knows about the 'PCBuild' directory, but
it knows nothing about either PCBuild8 or PCBuild8/x64.

A number of other build processes also know to look inside a PCBuild
directory (eg, Mozilla), so instead of formalizing PCBuild8, I think we
should merge PCBuild8 into PCBuild.  This could mean PCBuild/vs7 and
PCBuild/vs8 directories with the "project" files, but binaries would still
be generated in the 'PCBuild' (or PCBuild/x64) directory.  This would mean
the same tree isn't capable of hosting 2 builds from different VS compilers,
but I think that is reasonable (if it's a problem, just have multiple source
directories).  I understand that PCBuild8 is not "official", but in the
assumption that future versions of Python will use a compiler later than
VS7, it makes sense to me to clean this up now - what are others opinions on
this?

* Re the x64 directory used by the PCBuild8 process.  IMO, it makes sense to
generate x64 binaries to their own directory - my expectation is that
cross-compiling between platforms is a reasonable use-case, and we should
support multiple architectures for the same compiler version.  This would
mean formalizing the x64 directory in both 'PCBuild' and distutils, and
leaving other external build processes to update as they support x64 builds.
Does this make sense?  Would this fatally break other scripts used for
packaging (eg, the MSI framework)?

* Wide characters in VS8: PC/pyconfig.h defines PY_UNICODE_TYPE as 'unsigned
short', which corresponds with both 'WCHAR' and 'wchar' in previous compiler
versions.  VS8 defines this as wchar_t, which I'm struggling to find a
formal definition for beyond being 2 bytes.  My problem is that code which
assumes a 'Py_UNICODE *' could be used in place of a 'WCHAR *' now fails.  I
believe the intent on Windows has always been "Py_UNICODE == 'native
unicode'" - should PC/pyconfig.h reflect this (ie, should pyconfig.h grow a
version-specific definition of Py_UNICODE as wchar_t)?
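The size question is easy to check from Python itself; this sketch only assumes ctypes is available (wchar_t is 2 bytes on Windows, where it holds UTF-16 code units, and typically 4 on Linux/glibc):

```python
import ctypes

# sizeof(wchar_t) as the platform's C compiler sees it
print(ctypes.sizeof(ctypes.c_wchar))   # 2 on Windows, usually 4 elsewhere
```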

* Finally, as something from left-field which may well take 12 months or
more to pull off - but would there be any interest in moving the Windows
build process to a cygwin environment based on the existing autoconf
scripts?  I know a couple of projects are doing this successfully, including
Mozilla, so it has precedent.  It does impose a greater burden on people
trying to build on Windows, but I'd suggest that in recent times, many
people who are likely to want to build Python on Windows are already likely
to have a cygwin environment.  Simpler mingw builds and nuking MSVC specific
build stuff are among the advantages this would bring.  It is not worth
adding this as "yet another windows build option" - so IMO it is only worth
progressing with if it became the "blessed" build process for windows - if
there is support for this, I'll work on it as the opportunity presents
itself...

I'm (obviously) only suggesting we do this on the trunk and am happy to make
all agreed changes - but I welcome all suggestions or critisisms of this
endeavour...

Cheers,

Mark



Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Martin v. Löwis
> Georg> So, that's not really a concern of mine ;)
> 
> You must realize that people will use the core tools to create documentation
> for third party packages which aren't in the core.  If you replace LaTeX
> with something else I think you need to keep math in mind whether it's used
> in the core documentation or not.

I disagree. The documentation infrastructure of Python should only
consider the needs of Python itself. If other people can use that
infrastructure for other purposes, fine - if they find that it does
not meet their needs, they have to look elsewhere.

We are developing a programming language here, not a typesetting
system.

Regards,
Martin


Re: [Python-Dev] Introduction and request for commit access to the sandbox.

2007-05-20 Thread Martin v. Löwis
> With that said, I would like to request svn access to the sandbox for my
> work. I will use this access only for modifying stuff in the directory
> I will be assigned to. I would like to use the username "avassalotti"
> and the attached SSH2 public key for this access.

I have added your key. As we have a strict first.last account policy,
I named it alexandre.vassalotti; please correct me if I misspelled it.

> One last thing, if you know semantic differences (other than the
> obvious ones) between the C and Python versions of the modules I need
> to merge, please let know. This will greatly simplify the merge and
> reduce the chances of later breaking.

Somebody noticed on c.l.p that, for cPickle,
a) cPickle will start memo keys at 1; pickle at 0
b) cPickle will not put things into the memo if their refcount is
   1, whereas pickle puts everything into the memo.

Not sure what you'd consider obvious, but I'll mention that cStringIO
"obviously" is constrained in what data types you can write (namely,
byte strings only), whereas StringIO allows Unicode strings as well.
Less obviously, StringIO also allows

py> s = StringIO(0)
py> s.write(10)
py> s.write(20)
py> s.getvalue()
'1020'

Regards,
Martin


Re: [Python-Dev] Adventures with x64, VS7 and VS8 on Windows

2007-05-20 Thread Martin v. Löwis
> * In trying to build x64 from a 32bit VS7 (ie, cross-compiling via the
> PCBuild directory), the python.exe project fails with:
> 
> pythoncore fatal error LNK1112: module machine type 'X86' conflicts with
> target machine type 'AMD64'
> 
>   is this a known issue, or am I doing something wrong?

You are likely doing something wrong:
a) I assume it's VS 7.1 (i.e. VS.NET 2003); VS 2002 is not supported
   at all
b) you probably didn't install vsextcomp, but you should.
   In fact, you don't need all of it, but you do need the cl.exe and
   link.exe wrappers it comes with - they dispatch to the proper
   tools from the SDK
c) in case it isn't clear: you also need an AMD64 compiler, e.g.
   from the platform SDK.
   Unfortunately, Microsoft keeps changing the registry settings for
   the SDK, so vsextcomp only knows about some selected SDKs. If
   that causes a problem, please let me know.

> * The PCBuild8 project files appear to work without modification (I only
> tried native compilation here though, not a cross-compile) - however, unlike
> the PCBuild directory, they place all binaries in a 'PCBuild8/x64'
> directory.  While this means that it's possible to build for multiple
> architectures from the same source tree, it makes life harder for tools like
> 'distutils' - eg, distutils already knows about the 'PCBuild' directory, but
> it knows nothing about either PCBuild8 or PCBuild8/x64.

This is an issue to be discussed for Python 2.6. I'm personally hesitant
to have the "official" build infrastructure deviate from the layout that
has been in-use for so many years, as a lot of things depend on it.

I don't find the need to have separate object directories convincing:
For building the Win32/Win64 binaries, I have separate checkouts
*anyway*, since all the add-on libraries would have to support
multi-arch builds, but I think they don't.

> A number of other build processes also know to look inside a PCBuild
> directory (eg, Mozilla), so instead of formalizing PCBuild8, I think we
> should merge PCBuild8 into PCBuild.

Right - PCbuild8 should not get formalized. It probably should continue
to be maintained.

For 2.6, the first question to answer is: what compiler should it use?

I would personally like to see Python "skip" VS 2005 altogether,
as it will soon be superseded by Orcas. Unfortunately, it's unclear
how long Microsoft will need to release Orcas (and also, when Python
2.6 will be released), so I would like to defer that question by
a few months.

> I understand that PCBuild8 is not "official", but in the
> assumption that future versions of Python will use a compiler later than
> VS7, it makes sense to me to clean this up now - what are others opinions on
> this?

Not "official" really only means "not used to build the official
binaries" - just like PC/VC6. It's still (somewhat) maintained.

As for cleaning it up - see above. I would *really* like to skip
VS 2005 altogether, as I expect that soon after we decide to use
VS 2005, Microsoft will replace it with the next release, stop
supporting VS 2005, take the free compiler off the net, and
so on (just like they did for VS 2003, soon after we decided to
use it for 2.5).

> * Re the x64 directory used by the PCBuild8 process.  IMO, it makes sense to
> generate x64 binaries to their own directory - my expectation is that
> cross-compiling between platforms is a reasonable use-case, and we should
> support multiple architectures for the same compiler version.

See above; I disagree. First, "multiple architectures" only means x86,
AMD64, and Itanium, and I would like to drop "official" Itanium binaries
from 2.6 (even though they could continue to be supported in the build
process). Then, even if the Python build itself supports multiple
simultaneous architectures, the extension modules don't all (correct
me if I'm wrong).

> This would
> mean formalizing the x64 directory in both 'PCBuild' and distutils, and
> leaving other external build processes to update as they support x64 builds.
> Does this make sense?  Would this fatally break other scripts used for
> packaging (eg, the MSI framework)?

The MSI packaging would need to be changed, certainly. It currently
detects the architecture it needs to package by looking at the file
type of python.exe; that would have to be changed to give it an
explicit parameter what architecture to package, or have it package
all architectures it can find.
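(The architecture detection described here can be sketched in a few lines of
Python: read the Machine field from the executable's PE header. `pe_machine`
is a hypothetical helper for illustration, not part of the actual MSI
tooling:)

```python
import struct

# IMAGE_FILE_MACHINE constants for the architectures discussed here.
MACHINES = {0x014C: "x86", 0x8664: "AMD64", 0x0200: "Itanium"}

def pe_machine(path):
    """Return the Machine field from an executable's PE header."""
    with open(path, "rb") as f:
        f.seek(0x3C)                       # e_lfanew: offset of the PE signature
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        assert f.read(4) == b"PE\x00\x00"  # PE signature
        (machine,) = struct.unpack("<H", f.read(2))
        return MACHINES.get(machine, hex(machine))

# e.g. pe_machine("python.exe") would report "AMD64" for an x64 build
```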

> * Wide characters in VS8: PC/pyconfig.h defines PY_UNICODE_TYPE as 'unsigned
> short', which corresponds with both 'WCHAR' and 'wchar' in previous compiler
> versions.  VS8 defines this as wchar_t, which I'm struggling to find a
> formal definition for beyond being 2 bytes.

In C or in C++? In C++, wchar_t is a builtin type, just like short, int,
long. So there is no further formal definition.

In C (including C99), wchar_t ought to be defined in stddef.h.
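(The platform difference can be observed from Python itself; a minimal check,
where Windows typically reports 2 and most Unix systems 4:)

```python
import ctypes

# Width of the platform's wchar_t: 2 bytes on Windows (UTF-16 code units),
# typically 4 bytes (UCS-4) on Unix systems.
print(ctypes.sizeof(ctypes.c_wchar))
```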

> My problem is that code which
> assumes a 'Py_UNICODE *' could be used in place of a 'WCHAR *' now fails

Re: [Python-Dev] The docs, reloaded

2007-05-20 Thread Brett Cannon

On 5/20/07, "Martin v. Löwis" <[EMAIL PROTECTED]> wrote:


> Georg> So, that's not really a concern of mine ;)
>
> You must realize that people will use the core tools to create
> documentation for third party packages which aren't in the core.  If you
> replace LaTeX with something else I think you need to keep math in mind
> whether it's used in the core documentation or not.

I disagree. The documentation infrastructure of Python should only
consider the needs of Python itself. If other people can use that
infrastructure for other purposes, fine - if they find that it does
not meet their needs, they have to look elsewhere.




Martin beat me to my comment.  =)  Python's needs should come first,
period.  If Georg wants to add math support, fine.  But honestly I would
rather he spend his time on Python-specific stuff than get bogged down
supporting possible third parties.

-Brett


Re: [Python-Dev] Adventures with x64, VS7 and VS8 on Windows

2007-05-20 Thread Mark Hammond
Hi Martin,

> You are likely doing something wrong:
> a) I assume it's VS 7.1 (i.e. VS.NET 2003); VS 2002 is not supported
>at all
> b) you probably didn't install vsextcomp, but you should.
>In fact, you don't need all of it, but you do need the cl.exe and
>link.exe wrappers it comes with - they dispatch to the proper
>tools from the SDK
> c) in case it isn't clear: you also need an AMD64 compiler, e.g.
>from the platform SDK.
>Unfortunately, Microsoft keeps changing the registry settings for
>the SDK, so vsextcomp only knows about some selected SDKs. If
>that causes a problem, please let me know.

I'm using the full-blown VS.NET 2003, as given to a number of python-dev
people by Microsoft a number of years ago.  This appears to come with the
SDK and a 64bit compiler.  I'm guessing vsextcomp doesn't use the Visual
Studio 'ReleaseAMD64' configuration - would it be OK for me to check in
changes to the PCBuild projects for this configuration?

> This is an issue to be discussed for Python 2.6. I'm
> personally hesitant
> to have the "official" build infrastructure deviate from the
> layout that
> has been in-use for so many years, as a lot of things depend on it.

Yes, I agree - although I consider x64 new enough that an opportunity exists
to set a new 'standard'.  However, if most 'external' build processes will
not otherwise need to change for a 64bit environment, then I agree that
nothing should change in Python's layout.

> Right - PCbuild8 should not get formalized. It probably
> should continue to be maintained.

So is there something we can do to make distutils play better with binaries
built from PCBuild8, even though it is considered temporary?  It seems the
best thing might be to modify the PCBuild8 build process so the output
binaries are in the '../PCBuild' directory - this way distutils and others
continue to work fine.  Does that sound reasonable?

> For 2.6, the first question to answer is: what compiler should it use?
>
> I would personally like to see Python "skip" VS 2005 altogether,
> as it will soon be superseded by Orcas. Unfortunately, it's unclear
> how long Microsoft will need to release Orcas (and also, when Python
> 2.6 will be released), so I would like to defer that question by
> a few months.

I've no objection to that - but I'd like to help keep the pain to a minimum
for people who find themselves trying to build 64bit extensions in the
meantime.   Anecdotally, VS8 is the compiler most people start trying to use
for this (quite possibly because that is what they already have handy).

> See above; I disagree. First, "multiple architectures" only means x86,
> AMD64, and Itanium, and I would like to drop "official"
> Itanium binaries
> from 2.6 (even though they could continue to be supported in the build
> process). Then, even if the Python build itself support multiple
> simultaneous architectures, the extension modules don't all (correct
> me if I'm wrong).

Yes, I agree that it is unlikely to work in practice - at least for a number
of years as the external libs and extensions catch up.

> > * Wide characters in VS8: PC/pyconfig.h defines PY_UNICODE_TYPE as
> > 'unsigned short', which corresponds with both 'WCHAR' and 'wchar' in
> > previous compiler versions.  VS8 defines this as wchar_t, which I'm
> > struggling to find a formal definition for beyond being 2 bytes.
>
> In C or in C++? In C++, wchar_t is a builtin type, just like
> short, int,
> long. So there is no further formal definition.

This was in C++, but the problem was really WCHAR, as used by much of the
win32 API.

> I'd rather make it a platform-specific definition (for
> platform=Windows
> API). Correct me if I'm wrong, but isn't wchar_t also available in VS
> 2003 (and even in VC6?). And doesn't it have the "right" definition in
> all these compilers?

hrm - as above, I'm more concerned with the definition of WCHAR - which
means my problem is related more to the Platform SDK version rather than the
compiler.  This is unfortunate - on one hand we do consider
'platform=Windows API', and WCHAR is very much an API concept.  I'll need to
dig some more into this, but at least I know I'm not wasting my time :)

> > * Finally, as something from left-field which may well take
> > 12 months or
> > more to pull off - but would there be any interest to
> > moving the Windows
> > build process to a cygwin environment based on the existing autoconf
> > scripts?
>
> What compiler would you use there? I very much like using the VS
> debugger when developing on Windows, so that capability should not
> go away.

You would use whatever compiler the autoconf toolset found.  Recent versions
know enough about MSVC for simple projects.  Many people would need to take
care that their environment pointed at the correct compiler - especially the
person building releases.

But assuming MSVC was found and had the appropriate switches passed, there
would be no impact on the ability to use Visual Studio