[issue47154] -arch detection in _osx_support generates false positives
Change by Isuru Fernando : -- keywords: +patch pull_requests: +30256 stage: -> patch review pull_request: https://github.com/python/cpython/pull/32178 ___ Python tracker <https://bugs.python.org/issue47154> ___ ___ Python-bugs-list mailing list Unsubscribe: https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
[issue47154] -arch detection in _osx_support generates false positives
New submission from Isuru Fernando : If `cflags` contains something like `-I/Users/isuru/handy-archives-env/include`, the code assumes that there is a `-arch` flag in it (the substring `-arch` occurs inside the path, in "handy-archives") and fails with `ValueError: Don't know machine value for archs=()` -- components: Build messages: 416278 nosy: isuruf priority: normal severity: normal status: open title: -arch detection in _osx_support generates false positives versions: Python 3.10, Python 3.11, Python 3.9 ___ Python tracker <https://bugs.python.org/issue47154> ___
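A more robust check would treat the flags as a token list rather than scanning the raw string for the substring `-arch`. A minimal sketch (the helper name and parsing are my own illustration, not the `_osx_support` code):

```python
import shlex

def get_archs(cflags):
    """Return the values of standalone -arch flags in a compiler flag string.

    Splitting into tokens first means paths such as
    -I/Users/isuru/handy-archives-env/include no longer match.
    """
    tokens = shlex.split(cflags)
    archs = []
    for i, tok in enumerate(tokens):
        if tok == "-arch" and i + 1 < len(tokens):
            archs.append(tokens[i + 1])
    return archs

# A path merely containing "-arch" as a substring is not a false positive:
assert get_archs("-I/Users/isuru/handy-archives-env/include") == []
assert get_archs("-arch arm64 -arch x86_64 -O2") == ["arm64", "x86_64"]
```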
[issue46412] PyQT6 projects crashes with python 3.10
Fernando Pérez Gómez added the comment: I have to perform several tests to provide a detailed report, check other third-party libraries (the MySQL connector, for example), and send it to you. There are several things that don't work with Qt libs (the one that works best is PyQt6). When I finish the tests and have the report, I'll send it to you. In the meantime I suggest you search the web for pyqt6 + python3.10; there are plenty of complaints on all platforms (Windows, Mac and Linux). My system is Manjaro Linux, kernel version 5.15-2, and the installed PyQt6 version is 6.22.3. So far I have verified that it is not the fault of the IDE, and that it makes no difference whether the plugins are installed via pip. Of course I will report the same to Riverbank Computing. Sorry I can't be more specific at the moment, because Python was updated on my system only recently. Thanks for your patience; I'll send you a full report in a couple of days. That takes work.

On Mon, 17 Jan 2022 at 16:02, Ronald Oussoren () wrote:
>
> Ronald Oussoren added the comment:
>
> This is most likely a problem with PyQt6. Please ask that project first
> (with a clearer description of what goes wrong and on which platforms).
>
> --
> nosy: +ronaldoussoren
> resolution: -> third party
> stage: -> resolved
> status: open -> closed
>
> ___
> Python tracker
> <https://bugs.python.org/issue46412>
> ___

-- ___ Python tracker <https://bugs.python.org/issue46412> ___
[issue46412] PyQT6 projects crashes with python 3.10
New submission from Fernando Pérez Gómez : Can't translate .ui files; alignment flags and animation curve types don't work, etc. -- messages: 410791 nosy: fernandoprezgmez priority: normal severity: normal status: open title: PyQT6 projects crashes with python 3.10 type: crash versions: Python 3.10 ___ Python tracker <https://bugs.python.org/issue46412> ___
[issue44556] ctypes unittest crashes with libffi 3.4.2
Isuru Fernando added the comment: Duplicate of https://bugs.python.org/issue45350 -- nosy: +isuruf ___ Python tracker <https://bugs.python.org/issue44556> ___
[issue44182] python-config.sh vs python-config.py inconsistency
Change by Isuru Fernando : -- keywords: +patch pull_requests: +27071 stage: -> patch review pull_request: https://github.com/python/cpython/pull/28725 ___ Python tracker <https://bugs.python.org/issue44182> ___
[issue44182] python-config.sh vs python-config.py inconsistency
Isuru Fernando added the comment: Agree that this should be fixed. If you want, I can send a PR. -- nosy: +isuruf ___ Python tracker <https://bugs.python.org/issue44182> ___
[issue40503] PEP 615: Add zoneinfo module
Isuru Fernando added the comment: Do you have a suggestion for how to make it configurable at compile time? On POSIX platforms, we can set `--with-tzpath` to make it configurable at compile time. -- ___ Python tracker <https://bugs.python.org/issue40503> ___
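For reference, the search path that `--with-tzpath` bakes in can also be inspected and overridden at run time through the documented `zoneinfo` knobs; a small sketch (the override path is just an example):

```python
import zoneinfo

# TZPATH reflects the compiled-in default, the PYTHONTZPATH environment
# variable, or whatever reset_tzpath() was last given.
print(zoneinfo.TZPATH)

# Override programmatically (the path here is illustrative):
zoneinfo.reset_tzpath(to=["/usr/share/zoneinfo"])
assert zoneinfo.TZPATH == ("/usr/share/zoneinfo",)

# Restore the environment/compile-time default.
zoneinfo.reset_tzpath()
```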
[issue45258] sysroot_paths in setup.py does not consider -isysroot for macOS
Change by Isuru Fernando : -- keywords: +patch pull_requests: +26896 stage: -> patch review pull_request: https://github.com/python/cpython/pull/28501 ___ Python tracker <https://bugs.python.org/issue45258> ___
[issue45258] sysroot_paths in setup.py does not consider -isysroot for macOS
New submission from Isuru Fernando : It only looks at `--sysroot`, which is Linux-specific. -- components: Build messages: 402338 nosy: FFY00, isuruf priority: normal severity: normal status: open title: sysroot_paths in setup.py does not consider -isysroot for macOS versions: Python 3.11 ___ Python tracker <https://bugs.python.org/issue45258> ___
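A sketch of how both spellings could be picked up from the compiler flags (this is a hypothetical helper to illustrate the point, not the actual setup.py code):

```python
import shlex

def find_sysroot(flags):
    """Return the sysroot from a flag string, honouring both the GCC-style
    --sysroot=DIR and the macOS-style -isysroot DIR (or -isysrootDIR)."""
    tokens = shlex.split(flags)
    sysroot = None
    i = 0
    while i < len(tokens):
        tok = tokens[i]
        if tok.startswith("--sysroot="):
            sysroot = tok[len("--sysroot="):]
        elif tok == "-isysroot" and i + 1 < len(tokens):
            sysroot = tokens[i + 1]
            i += 1
        elif tok.startswith("-isysroot"):
            sysroot = tok[len("-isysroot"):]
        i += 1
    return sysroot

assert find_sysroot("--sysroot=/opt/cross") == "/opt/cross"
assert find_sysroot("-O2 -isysroot /SDKs/MacOSX.sdk") == "/SDKs/MacOSX.sdk"
assert find_sysroot("-isysroot/SDKs/MacOSX.sdk") == "/SDKs/MacOSX.sdk"
```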
[issue40503] PEP 615: Add zoneinfo module
Isuru Fernando added the comment: Thanks @steve.dower for the info. I've created https://github.com/python/cpython/pull/28495. Let me know if it needs to have a separate bpo issue. -- ___ Python tracker <https://bugs.python.org/issue40503> ___
[issue40503] PEP 615: Add zoneinfo module
Change by Isuru Fernando : -- pull_requests: +26890 pull_request: https://github.com/python/cpython/pull/28495 ___ Python tracker <https://bugs.python.org/issue40503> ___
[issue40503] PEP 615: Add zoneinfo module
Isuru Fernando added the comment: > If anyone building Python for Windows shows up needing support for this, we > can re-visit the issue — I don't believe it's technically infeasible, just > that the usage patterns of Python on Windows mean that it's not worth doing. At conda-forge, we need this, and our current solution is https://github.com/conda-forge/python-feedstock/blob/8195ba1178041b7461238e8c5680eee62f5ea9d0/recipe/patches/0032-Fix-TZPATH-on-windows.patch#L19 I can change it to check whether that directory exists. What do you think? -- nosy: +FFY00, isuruf ___ Python tracker <https://bugs.python.org/issue40503> ___
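The "check whether that directory exists" idea could look roughly like this (the share/zoneinfo layout relative to the prefix is an assumption taken from the conda-forge patch, not CPython's default behaviour):

```python
import os
import sys

# Hypothetical Windows default: tz data shipped relative to the prefix,
# as in a conda-style layout.
candidate = os.path.join(sys.prefix, "share", "zoneinfo")

# Only advertise the path when the data is actually installed there.
default_tzpath = [candidate] if os.path.isdir(candidate) else []
print(default_tzpath)
```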
[issue22699] Module source files not found when cross-compiling
Change by Isuru Fernando : -- keywords: +patch nosy: +isuruf nosy_count: 8.0 -> 9.0 pull_requests: +26810 stage: -> patch review pull_request: https://github.com/python/cpython/pull/28397 ___ Python tracker <https://bugs.python.org/issue22699> ___
[issue41100] Support macOS 11 and Apple Silicon Macs
Change by Isuru Fernando : -- nosy: +isuruf nosy_count: 23.0 -> 24.0 pull_requests: +25069 pull_request: https://github.com/python/cpython/pull/26474 ___ Python tracker <https://bugs.python.org/issue41100> ___
[issue43052] _dyld_shared_cache_contains_path needs SYSTEM_VERSION_COMPAT=0
Isuru Fernando added the comment: You are right. I think I may have accidentally used the wrong SDK. Is explicitly setting `SYSTEM_VERSION_COMPAT=1` unsupported, then? -- ___ Python tracker <https://bugs.python.org/issue43052> ___
[issue43052] _dyld_shared_cache_contains_path needs SYSTEM_VERSION_COMPAT=0
New submission from Isuru Fernando : In macOS Big Sur, if the executable was compiled with MACOSX_DEPLOYMENT_TARGET=10.15 or below, then SYSTEM_VERSION_COMPAT=1 is the default, which means that Big Sur reports itself as 10.16, which in turn means that `__builtin_available(macOS 11.0, *)` will not be triggered. This can be observed by using the Python 3.9.1 universal2 installer on x86_64 Big Sur or with Rosetta 2 on arm64 Big Sur. (Not an issue with native arm64, as that part is compiled with MACOSX_DEPLOYMENT_TARGET=11.0.) The original issue is that the following returned None:

SYSTEM_VERSION_COMPAT=1 arch -x86_64 /usr/local/bin/python3 -c "from ctypes.util import find_library; print(find_library('AppKit'))"

-- messages: 385845 nosy: isuruf, ned.deily, ronaldoussoren priority: normal pull_requests: 23185 severity: normal status: open title: _dyld_shared_cache_contains_path needs SYSTEM_VERSION_COMPAT=0 ___ Python tracker <https://bugs.python.org/issue43052> ___
[issue42476] Allow cross compiling python for macos-arm64 from macos-x86_64
Change by Isuru Fernando : -- keywords: +patch pull_requests: +22407 stage: -> patch review pull_request: https://github.com/python/cpython/pull/23523 ___ Python tracker <https://bugs.python.org/issue42476> ___
[issue42476] Allow cross compiling python for macos-arm64 from macos-x86_64
New submission from Isuru Fernando : Only a few changes are needed and I will send a pull request. This was used for providing macos-arm64 builds for conda, where we use cross compiling exclusively for all macos-arm64 builds. -- components: Build messages: 381908 nosy: isuruf, willingc priority: normal severity: normal status: open title: Allow cross compiling python for macos-arm64 from macos-x86_64 type: enhancement versions: Python 3.10, Python 3.8, Python 3.9 ___ Python tracker <https://bugs.python.org/issue42476> ___
[issue40154] embedded null byte when connecting to sqlite database using a bytes object
Fernando added the comment: Hello SilentGhost, Okay, now I understand the difference and had my code working! Thank you very much for your answer and to all of you who help in making Python better. (Wish I had more knowledge of it to help) Have a nice day! -- ___ Python tracker <https://bugs.python.org/issue40154> ___
[issue40154] embedded null byte when connecting to sqlite database using a bytes object
Change by Fernando : -- components: +Extension Modules -IO ___ Python tracker <https://bugs.python.org/issue40154> ___
[issue40154] embedded null byte when connecting to sqlite database using a bytes object
Fernando added the comment: bump? -- ___ Python tracker <https://bugs.python.org/issue40154> ___
[issue40154] embedded null byte when connecting to sqlite database using a bytes object
New submission from Fernando : Hello. I think that I found a bug in how the sqlite3 module handles bytes. The connect function of sqlite3 accepts strings, path-like objects and bytes. However, it's impossible for me to connect using bytes objects that are read from BufferedReaders. I always get: "ValueError: embedded null byte". This is my current code (byteDec is the BytesIO object):

==
byteDec.seek(0)
conn = sqlite3.connect(byteDec.read())
==

That returns the "embedded null byte" error. However, if I do:

==
byteDec.seek(0)
with open("db.db", "wb") as f:
    f.write(byteDec.read())
conn = sqlite3.connect("db.db")
==

Everything works flawlessly, so the BufferedReader that I have in memory is not corrupted in any way, as it's readable from a file. I want to avoid writing to disk at all, so this is not a solution for me. I attach to this issue a very basic proof of concept to understand the issue. I'm running Python 3.8.2 amd64 on Windows 10 1909 -- components: IO files: bytes_io.py messages: 365582 nosy: ferferga priority: normal severity: normal status: open title: embedded null byte when connecting to sqlite database using a bytes object type: crash versions: Python 3.8 Added file: https://bugs.python.org/file49025/bytes_io.py ___ Python tracker <https://bugs.python.org/issue40154> ___
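For context: `sqlite3.connect()` interprets a bytes argument as a *filename*, not as database contents, so raw database bytes (which always contain NUL bytes) trigger the "embedded null byte" error. On Python 3.11+ the supported way to load a database from bytes without touching disk is `Connection.deserialize()`; a sketch, guarded for older versions:

```python
import sqlite3

# Build some database bytes in memory, standing in for the poster's BytesIO.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE t (x)")
src.execute("INSERT INTO t VALUES (1)")
src.commit()

if hasattr(src, "serialize"):  # Connection.serialize/deserialize need 3.11+
    data = src.serialize()            # bytes of the whole database
    dst = sqlite3.connect(":memory:")
    dst.deserialize(data)             # load those bytes; no file involved
    assert dst.execute("SELECT x FROM t").fetchone() == (1,)
```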
[issue33351] Support compiling with clang-cl on Windows
Change by Isuru Fernando : -- pull_requests: +17747 pull_request: https://github.com/python/cpython/pull/18371 ___ Python tracker <https://bugs.python.org/issue33351> ___
[issue37916] distutils: allow overriding of the RANLIB command on macOS (darwin)
Isuru Fernando added the comment: This is an issue even without cross-compilation: there is an environment variable for the archiver, so the archiver and ranlib can come from two different toolchains, which leads to errors. I ran into this in Python 3.8 on macOS with an LTO build. -- nosy: +isuruf ___ Python tracker <https://bugs.python.org/issue37916> ___
[issue32423] The Windows SDK version 10.0.15063.0 was not found
Isuru Fernando added the comment: Fixed in https://github.com/python/cpython/pull/12445 -- stage: -> resolved status: open -> closed ___ Python tracker <https://bugs.python.org/issue32423> ___
[issue32423] The Windows SDK version 10.0.15063.0 was not found
New submission from Isuru Fernando <isu...@gmail.com>: When compiling Python 3.6.4 on AppVeyor using MSVC 2015, the following error occurs.

(C:\bld\python_1514037886491\_b_env) C:\bld\python_1514037886491\work\Python-3.6.4\PCbuild>"C:\Program Files (x86)\MSBuild\14.0\Bin\amd64\MSBuild.exe" "C:\bld\python_1514037886491\work\Python-3.6.4\PCbuild\pcbuild.proj" /t:Build /m /nologo /v:m /p:Configuration=Release /p:Platform=x64 /p:IncludeExternals=true /p:IncludeSSL=true /p:IncludeTkinter=true /p:UseTestMarker= /p:GIT="C:\Program Files\Git\cmd\git.exe"

C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\V140\Platforms\x64\PlatformToolsets\v140\Toolset.targets(36,5): error MSB8036: The Windows SDK version 10.0.15063.0 was not found. Install the required version of Windows SDK or change the SDK version in the project property pages or by right-clicking the solution and selecting "Retarget solution". [C:\bld\python_1514037886491\work\Python-3.6.4\PCbuild\pythoncore.vcxproj]

Note that the AppVeyor Visual Studio 2015 image has only 10.0.10586, 10.0.14393 and 10.0.26624. Here's a simple patch that fixes this on the 3.6 branch: https://github.com/isuruf/cpython/commit/9432a2c7f63b3bb55e8066e91eade81321154476 I haven't checked that the patch works on a machine with 10.0.15063. -- components: Windows messages: 308982 nosy: Isuru Fernando, paul.moore, steve.dower, tim.golden, zach.ware priority: normal severity: normal status: open title: The Windows SDK version 10.0.15063.0 was not found type: compile error versions: Python 3.6 ___ Python tracker <rep...@bugs.python.org> <https://bugs.python.org/issue32423> ___
Re: Python boilerplate
@all I released version 1.0.0 with a tiny glossary and an explanation of each file in the boilerplate. @Chris I made the boilerplate with the intent that everyone can understand, download and use it quickly. So I didn't add extra dependencies like cookiecutter (which depends on jinja, which depends on markupsafe) **just** to replace fields and then run the project. I also preferred .md over .rst because it's cleaner in my opinion and used by default on platforms like GitHub and Stack Overflow. See mkdocs to generate documentation with markdown. In the same way, I chose pytest because the default test framework is verbose and its CamelCase sux. About entry_points, maybe I'll consider that too, but I didn't understand why packages are better than modules... both can be reusable, and not every project needs packages. I looked at your release shell script. It's very nice. The Flask GitHub repository has a pretty nice one too; see scripts/make-release.py. Thanks, -- https://mail.python.org/mailman/listinfo/python-list
Python boilerplate
A simple boilerplate for those who don't know the structure of a project. https://goo.gl/lJRvS6

## Features

* Build and distribute with setuptools
* Check code style with flake8
* Make and run tests with pytest
* Run tests on every Python version with tox
* Code coverage with coverage.py

## Structure

Structure of the project in tree format.

├── CONTRIBUTING.md
├── LICENSE
├── Makefile
├── MANIFEST.in
├── module_name.py
├── README.md
├── requirements
│   ├── dev.txt
│   └── prod.txt
├── requirements.txt
├── setup.cfg
├── setup.py
├── tests.py
└── tox.ini

Fernando Felix
Re: "x == None" vs "x is None"
> I have seen at several places "x == None" and "x is None" within
> if-statements.
> What is the difference?
> Which term should I prefer and why?
>
> --
> Ullrich Horlacher          Server und Virtualisierung
> Rechenzentrum IZUS/TIK     E-Mail: horlac...@tik.uni-stuttgart.de
> Universitaet Stuttgart     Tel: ++49-711-68565868
> Allmandring 30a            Fax: ++49-711-682357
> 70550 Stuttgart (Germany)  WWW: http://www.tik.uni-stuttgart.de/

In this case, the Style Guide for Python Code [1] recommends using "is or is not, never the equality operators" [2]. If you use the pep8 tool [3] to check your code, the error code that will be raised is E711 [4].

[1] https://www.python.org/dev/peps/pep-0008/
[2] https://www.python.org/dev/peps/pep-0008/#programming-recommendations
[3] http://pep8.readthedocs.org/en/latest/intro.html
[4] http://pep8.readthedocs.org/en/latest/intro.html#error-codes
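The practical difference: `==` can be overridden by the operand's class, while `is` always tests object identity. A small illustration (the class is made up for the example):

```python
class AlwaysEqual:
    """Pathological __eq__ that claims equality with everything."""
    def __eq__(self, other):
        return True

x = AlwaysEqual()
assert x == None        # __eq__ gets the last word (pep8 flags this as E711)
assert x is not None    # the identity check is immune to __eq__
assert None is None     # there is only one None object
```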
Observer pattern implementation in Python based on jQuery
Hi, I made an observer/pubsub pattern implementation in Python based on jQuery. It's available in https://github.com/fernandojunior/python-pattern-observer. Take a look! :) Fernando Felix https://br.linkedin.com/in/fernandofnjr
First Brazilian programming MOOC with 10.000 enrolled
Python for Zombies [1] is the first MOOC in Portuguese to teach programming. Today we have 10,000 students enrolled, from 1,013 cities in Brazil. We started five months ago. The website is a Django application. [1] http://pycursos.com/python-para-zumbis/
Python for Myo Control Armband
Does anyone know if there is already a Python library for the Myo [1] gesture control armband? Are there other gesture controllers with Python libraries? Thanks in advance. [1] https://www.thalmic.com/en/myo/
Check for the type of arguments
I am new to Python, with some experience in Java, C++ and R. Writing in other languages I usually check the type and values of function arguments. In the Python code examples I have seen, this is rarely done. Questions:

1) Is this because it would be unpythonic, or just because the examples are not really production code?

2) If I still want to check the type of my arguments, do I
a) use type() or isinstance() to check for type?
b) use assert (I guess not), raise a ValueError, or sys.exit()?

(I noticed that raising a ValueError does not stop execution when I am running the Interactive Interpreter under PTVS, which I find inconvenient, but it does stop execution when running the code non-interactively.) Thanks. FS
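For what it's worth, when a check is genuinely needed the idiomatic pattern is `isinstance()` plus `raise TypeError`/`ValueError` (never `sys.exit()`); a sketch with a made-up function:

```python
def mean(values):
    # Prefer isinstance() over type(): it also accepts subclasses.
    if not isinstance(values, (list, tuple)):
        raise TypeError(f"values must be a list or tuple, got {type(values).__name__}")
    if not values:
        raise ValueError("values must not be empty")
    return sum(values) / len(values)

assert mean([1, 2, 3]) == 2
try:
    mean("not a sequence")
except TypeError:
    pass  # the caller gets a clear, catchable error
```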
Re: Check for the type of arguments
Thanks for the insightful answers. PTVS is Python Tools for Visual Studio.
Python getters and setters
I am new to Python. I understand that it is unpythonic to write getters and setters, and that property() can be used if necessary. This deals with the case of attributes, but there are other kinds of information available within a class. Suppose my class contains an attribute called data that can potentially provide a lot of information that will be needed by class users. I have two options:

1) For each piece of information within data (e.g., length) I write a method that retrieves that information:

def data_length(self):
    return len(self.data)

2) I do not create such a method. Users that are interested in that information will have to write len(obj.data), where obj is a previously instantiated object of my class.

Which one of the two alternatives fits better with the Python philosophy? The first alternative is more work for me, creates a heavier class and may have slower performance, but makes things easier for the user and is more implementation independent. Thanks for the help. FS
Re: Python getters and setters
The debate got more interesting than I expected. Without taking sides, I would like to add that perhaps my length example was misleading: length is easy to calculate. The information could be hard to extract from the data, either because complex calculations are involved, or because it is not apparent which pieces of information have to be combined. Also, notice that I never stated that the information was in the shape of an attribute. The comparison below may seem clear cut if the information to be obtained is length, but one could arrive at exactly the opposite conclusion by imagining that length is hard to calculate or extract. It may be easier and faster to look up a function in an API than figuring out how to do a complex calculation. Even in the simple case where the information is length there may be more than one reasonable definition. For example, when dealing with time series data in matrix form it makes sense to consider the number of rows as the length of the data, but it also makes sense to define length as the number of elements of the matrix. So the word len in the first half of the example below could hide different concepts: consistent but misleading. The different methods in the second half could have the virtue of clarifying which concept of length applies in each case.

# Yes, this is good, consistent design
len(myrecord.field)
len(obj.data)
len(data.value)
len(collection[key])

# No, this is crappy, inconsistent design
myrecord.field_len()
obj.data_length()
data.get_length_of_value()
collection.key_len(key)
how to use property?
Hi guys! I'm a noob in Python and I would like to know how to correctly use properties. I have read some things about it but I do not quite understand. I found this:

class C(object):
    def __init__(self):
        self._x = None

    @property
    def x(self):
        """I'm the 'x' property."""
        return self._x

    @x.setter
    def x(self, value):
        self._x = value

    @x.deleter
    def x(self):
        del self._x

But I think it's a bad habit to use _ to change the visibility of attributes as in Java. How do I correctly use properties?
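For context, this is how the property class from the post behaves once it is wired up (a small usage sketch; the class is repeated so the example is self-contained):

```python
class C(object):
    def __init__(self):
        self._x = None

    @property
    def x(self):
        """I'm the 'x' property."""
        return self._x

    @x.setter
    def x(self, value):
        self._x = value

    @x.deleter
    def x(self):
        del self._x

c = C()
assert c.x is None       # getter runs, returns the backing _x
c.x = 10                 # setter runs
assert c.x == 10
del c.x                  # deleter removes the backing attribute
assert not hasattr(c, "_x")
```

The leading underscore is not Java-style visibility; it is just the Python convention for an internal name, and the public property `x` is what callers use.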
A sad day for the scientific Python community. John Hunter, creator of matplotlib: 1968-2012.
difficult technical problems in matplotlib, teach courses and seminars about scientific Python, and more recently help create the NumFOCUS foundation project. Despite the challenges that raising three children in an expensive city like Chicago presented, he never once wavered from his commitment to open source. But unfortunately now he is not here anymore to continue providing for their well-being, and I hope that all those who have so far benefited from his generosity, will thank this wonderful man who always gave far more than he received. Thanks to the rapid action of Travis Oliphant, the NumFOCUS foundation is now acting as an escrow agent to accept donations that will go into a fund to support the education and care of his wonderful girls Rahel, Ava and Clara. If you have benefited from John's many contributions, please say thanks in the way that would matter most to him, by helping Miriam continue the task of caring for and educating Rahel, Ava and Clara. You will find all the information necessary to make a donation here: http://numfocus.org/johnhunter Remember that even a small donation helps! If all those who ever use matplotlib give just a little bit, in the long run I am sure that we can make a difference. If you are a company that benefits in a serious way from matplotlib, remember that John was a staunch advocate of keeping all scientific Python projects under the BSD license so that commercial users could benefit from them without worry. Please say thanks to John in a way commensurate with your resources (and check how much a yearly matlab license would cost you in case you have any doubts about the value you are getting...). John's family is planning a private burial in Tennessee, but (most likely in September) there will also be a memorial service in Chicago that friends and members of the community can attend. We don't have the final scheduling details at this point, but I will post them once we know. 
I would like to again express my gratitude to Travis Oliphant for moving quickly with the setup of the donation support, and to Eric Jones (the founder of Enthought and another one of the central figures in our community) who immediately upon learning of John's plight contributed resources to support the family with everyday logistics while John was facing treatment as well as my travel to Chicago to assist. This kind of immediate urge to come to the help of others that Eric and Travis displayed is a hallmark of our community. Before closing, I want to take a moment to publicly thank the incredible staff of the University of Chicago medical center. The last two weeks were an intense and brutal ordeal for John and his loved ones, but the hospital staff offered a sometimes hard to believe, unending supply of generosity, care and humanity in addition to their technical competence. The latter is something we expect from a first-rate hospital at a top university, where the attending physicians can be world-renowned specialists in their field. But the former is often forgotten in a world often ruled by a combination of science and concerns about regulations and liability. Instead, we found generous and tireless staff who did everything in their power to ease the pain, always putting our well being ahead of any mindless adherence to protocol, patiently tending to every need we had and working far beyond their stated responsibilities to support us. To name only one person (and many others are equally deserving), I want to thank Dr. Carla Moreira, chief surgical resident, who spent the last few hours of John's life with us despite having just completed a solid night shift of surgical work. Instead of resting she came to the ICU and worked to ensure that those last hours were as comfortable as possible for John; her generous actions helped us through a very difficult moment. It is now time to close this already too long message... 
John, thanks for everything you gave all of us, and for the privilege of knowing you. Fernando.
[ANN] IPython 0.13 is officially out!
Hi all,

on behalf of the IPython development team, and just in time for the imminent Debian freeze and SciPy 2012, I'm thrilled to announce, after an intense 6 months of work, the official release of IPython 0.13. This version contains several major new features, as well as a large amount of bug and regression fixes. The previous version (0.12) was released on December 19 2011, so in this development cycle we had:

- ~6 months of work.
- 373 pull requests merged.
- 742 issues closed (non-pull requests).
- contributions from 62 authors.
- 1760 commits.
- a diff of 114226 lines.

This means that we closed a total of 1115 issues over 6 months, for a rate of almost 200 issues closed and 300 commits per month. We are very grateful to all of you who have contributed so enthusiastically to the project and have had the patience of pushing your contributions through our often lengthy review process. We've also welcomed several new members to the core IPython development group: Jörgen Stenarson (@jstenar - this really was an omission as Jörgen has been our Windows expert for a long time) and Matthias Bussonnier (@Carreau), who has been very active on all fronts of the project.

*Highlights*

There is too much new work to write up here, so we refer you to our What's New document (http://ipython.org/ipython-doc/rel-0.13/whatsnew/version0.13.html) for the full details. But the main highlights of this release are:

* Brand new UI for the notebook, with major usability improvements (real menus, toolbar, and much more)
* Manage all your parallel cluster configurations from the notebook with push-button simplicity (cluster start/stop with one button).
* Cell magics: commands prefixed with %% apply to an entire cell. We ship with many cell magics by default, including timing, profiling, running cells under bash, Perl and Ruby as well as magics to interface seamlessly with Cython, R and Octave.
* The IPython.parallel tools have received many fixes, optimizations, and a number of API improvements to make writing, profiling and debugging parallel codes with IPython much easier.
* We have unified our interactive kernels (the basic ipython object you know and love) with the engines running in parallel, so that you can now use all IPython special tricks in parallel too. And you can connect a console or qtconsole to any parallel engine for direct, interactive execution, plotting and debugging in a cluster.

*Downloads*

Downloads can be found on:

- Github: http://github.com/ipython/ipython/downloads
- PyPI: http://pypi.python.org/pypi/ipython

More download/install details: http://ipython.org/download.html. Please see our release notes for the full details on everything about this release: http://ipython.org/ipython-doc/rel-0.13/whatsnew/version0.13.html

As usual, if you find any other problem, please file a ticket --or even better, a pull request fixing it-- on our github issues site (https://github.com/ipython/ipython/issues). Many thanks to all who contributed!

Fernando, on behalf of the IPython development team.
http://ipython.org
[issue10109] itertools.product with infinite iterator cause MemoryError.
Sumudu Fernando sumu...@gmail.com added the comment:

I don't agree with the response to this. It is true that, as implemented (at least in 2.7; I don't have 3.x handy to check), itertools.product requires finite iterables. However, this seems to be simply a consequence of the implementation and not part of the spirit of the function, which, as falsetru pointed out, is stated to be equivalent to nested for-loops in a generator expression. Indeed, implementing product in Python (in a recursive way) doesn't have this problem.

Perhaps a more convincing set of test cases to show why this could be considered a problem:

    >>> import itertools
    >>> itertools.product(xrange(100))
    <itertools.product object at 0xb7ed334c>
    >>> itertools.product(xrange(100))
    <itertools.product object at 0xb7ed620c>
    >>> itertools.product(xrange(10))
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    MemoryError

Note that I'm not even using an infinite iterable, just a really big one. The issue is that creating the iterator fails with a MemoryError, before I've even asked for any values. Consider the following:

    for (i, v) in enumerate(itertools.product(a, b, c)):
        if i < 1000:
            print v
        else:
            break

When a, b, and c are relatively small, finite iterables, this code works fine. However, if *any* of them are too large (or infinite), we see a MemoryError before the loop even starts, even though only 1000 elements are required. I think it's conceivable that we might want something like a = itertools.cycle(xrange(5)), and even that will break this loop.

That said, in all such cases I could think of, we can always either truncate big iterators before passing them to product, or use zip/comprehensions to add their values into the tuple (or some combination of those). So maybe it isn't a huge deal.

I've attached my implementation of product, which deals with infinite iterators by leveraging enumerate and itertools.cycle, and is pretty much a direct translation of the odometer idea.
This doesn't support the repeat parameter (but probably could, using itertools.tee). One thing that should be changed is that itertools.cycle shouldn't be called / doesn't need to be called on infinite iterators, but I couldn't figure out how to do that. Maybe there is some way to handle it in the C implementation?

In summary: the attached implementation of product can accept any mix of infinite/finite iterators, returning a generator intended for partial consumption. The existing itertools.product doesn't work in this case.

-- nosy: +Sumudu.Fernando
Added file: http://bugs.python.org/file24270/product.py

___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue10109> ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
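The attached product.py isn't reproduced in the archive, but the core point -- that a Python-level product need not materialize its inputs the way itertools.product does -- can be sketched directly. This is my own minimal illustration in modern Python 3 syntax (`lazy_product` is a hypothetical name, not the poster's odometer implementation), and unlike itertools.product it requires re-iterable arguments such as range:

```python
import itertools

def lazy_product(*iterables):
    # Recursive cartesian product that never materializes its inputs.
    # Each argument must be re-iterable (range, list, ...), since the
    # tail iterables are restarted once per element of the head.
    if not iterables:
        yield ()
        return
    head, rest = iterables[0], iterables[1:]
    for x in head:
        for tail in lazy_product(*rest):
            yield (x,) + tail

# Agrees with itertools.product on small finite inputs:
print(list(lazy_product(range(2), range(2))))  # -> [(0, 0), (0, 1), (1, 0), (1, 1)]

# Creating it over a huge range is instant, whereas itertools.product
# would first try to copy the whole range into a tuple:
print(next(lazy_product(range(10**12))))  # -> (0,)
```

This only covers the finite-but-huge case; handling truly infinite iterators needs the odometer/cycle bookkeeping the poster describes.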
[issue10109] itertools.product with infinite iterator cause MemoryError.
Sumudu Fernando sumu...@gmail.com added the comment:

> tuple(itertools.cycle(enumerate(it)) for it in itertools.count())
> ...
> TypeError: 'int' object is not iterable

That is not what happens in the function, though! That would correspond to doing product(*itertools.count(2010)), but if you try that you won't even get past argument expansion (obviously). Doing product(*xrange(10)) gives the error you're talking about, for example.

product(itertools.count(2010)) works perfectly well with the version I posted, though it is a bit silly to do it that way since it produces the same values as count itself (which is what cartesian product should do), while saving extra bookkeeping along the way.

Anyway, I'm pretty new to Python and I don't think this is quite relevant enough to warrant opening a new ticket. I'm happy to leave it here for the education of the next neophyte who stumbles across this idiosyncrasy of itertools.product.

-- ___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue10109> ___
Re: Python education survey
On Mon, 19 Dec 2011 19:51:00 -0800, Raymond Hettinger wrote:

> Do you use IDLE when teaching Python? If not, what is the tool of choice?

I'm obviously biased (I started IPython years ago), but I've done a lot of teaching and I still do like the combination of IPython plus an editor. Sometimes I use IDLE configured to only open the editor and not the shell, but I recommend that users learn a 'real' editor for the long run (aka emacs/vim), as it's an investment that will pay off many times over. But if nothing else, there's at least an OK free editor for each OS that does work, and I keep a 'starter kit' page with those resources for my students:

http://fperez.org/py4science/starter_kit.html

Cheers, f
[ANN] IPython 0.12 is out!
Hi all, on behalf of the IPython development team, I'm thrilled to announce, after an intense 4 1/2 months of work, the official release of IPython 0.12.

This is a very important release for IPython, for several reasons. First and foremost, we have a major new feature, our interactive web-based notebook, that has been in our sights for a very long time. We tried to build one years ago (with WX) as a Google SoC project in 2005, and had other prototypes later on, but things never quite worked. Finally, the refactoring effort started two years ago, the communications architecture we built in 2010, and the advances of modern browsers gave us all the necessary pieces.

With this foundation in place, while part of the team worked on the 0.11 release, Brian Granger had already started quietly building the web notebook, which we demoed in early-alpha mode at the SciPy 2011 conference (http://www.archive.org/details/Wednesday-203-6-IpythonANewArchitectureForInteractiveAndParallel). By the EuroScipy conference in August we had merged Brian's amazing effort into our master branch, and after that multiple people (old and new) jumped in to make all kinds of improvements, leaving us today with something that is an excellent foundation.

It's still the first release of the notebook, and as such we know it has a number of rough edges, but several of us have been using it as a daily research tool for the last few months. Do not hesitate to file issues for any problems you encounter with it, and we even have an 'open issue' for general discussion of ideas and features for the notebook at: https://github.com/ipython/ipython/issues/977.

Furthermore, it is clear that our big refactoring work, combined with the amazing facilities at Github, is paying off. The 0.11 series was a major amount of work, with 511 issues closed over almost two years. But that pales in comparison to this cycle: in only 4 1/2 months we closed 515 issues, with 50% being Pull Requests.
And very importantly, our list of contributors includes many new faces (see the credits section in our release notes for full details), which is the best thing that can happen to an open source project. We hope you will find the new features (the notebook isn't the only one! see below) compelling, and that many more will not only use IPython but will join the project; there's plenty to do, and now there are tasks for many different skill sets (web, javascript, gui work, low-level networking, parallel machinery, console apps, etc).

*Downloads*

Download links and instructions are at: http://ipython.org/download.html

And IPython is also on PyPI: http://pypi.python.org/pypi/ipython

Those contain a built version of the HTML docs; if you want pure source downloads with no docs, those are available on github:

Tarball: https://github.com/ipython/ipython/tarball/rel-0.12
Zipball: https://github.com/ipython/ipython/zipball/rel-0.12

* Features *

Here is a quick listing of the major new features:

- An interactive browser-based Notebook with rich media support
- Two-process terminal console
- Tabbed QtConsole
- Full Python 3 compatibility
- Standalone Kernel
- PyPy support

And many more... We closed over 500 tickets, merged over 200 pull requests, and more than 45 people contributed commits for the final release.

Please see our release notes for the full details on everything about this release: http://ipython.org/ipython-doc/stable/whatsnew/version0.12.html

* IPython tutorial at PyCon 2012 *

Those of you attending (or planning to attend) PyCon 2012 in Santa Clara, CA, may be interested in a hands-on tutorial we will be presenting on the many faces of IPython. See https://us.pycon.org/2012/schedule/presentation/121/ for full details.
* Errata *

This was caught by Matthias Bussonnier's (one of our great new contributors) sharp eyes while I was writing these release notes: in the example notebook called display_protocol, the first cell starts with:

    from IPython.lib.pylabtools import print_figure

which should instead be:

    from IPython.core.pylabtools import print_figure

This has already been fixed on master, but since the final 0.12 files have been uploaded to github and PyPI, we'll let them be.

As usual, if you find any other problem, please file a ticket --or even better, a pull request fixing it-- on our github issues site (https://github.com/ipython/ipython/issues/).

Many thanks to all who contributed!

Fernando, on behalf of the IPython development team.
http://ipython.org
Re: IPython 0.12 is out!
On Mon, 19 Dec 2011 20:00:03 -0800, alex23 wrote:

> You read the installation instructions and did a 'python setup.py install' as it states, yes? Installed that way for Python 2.7.2 under Win64 with no issues whatsoever.

Glad to hear that. Obviously, since I announced it here, I'll try to monitor this thread, but if the OP is still having problems with his installation, please don't hesitate to post a question on our user mailing list:

http://mail.scipy.org/mailman/listinfo/ipython-user

where other users and developers may be able to assist (I'll be on vacation shortly, so I may not be able to reply here; hence the chances for effective help will be higher on the list). We had no reports of problems with Windows installations during the beta and RC, so we'd love to know what's causing any issues here, to fix them for the next release.

Cheers, f
[ANN] IPython 0.11 is officially out
Hi all, on behalf of the IPython development team, I'm thrilled to announce, after more than two years of development work, the official release of IPython 0.11.

This release brings a long list of improvements and new features (along with, hopefully, few new bugs). We have completely refactored IPython, making it a much more friendly project to participate in by having better separated and organized internals. We hope you will not only use the new tools and libraries, but also join us with new ideas and development.

After this very long development effort, we hope to make a few stabilization releases at a quicker pace, where we iron out the kinks in the new APIs and complete some remaining internal cleanup work. We will then make a (long awaited) IPython 1.0 release with these stable APIs.

*Downloads*

Download links and instructions are at: http://ipython.org/download.html

And IPython is also on PyPI: http://pypi.python.org/pypi/ipython

Those contain a built version of the HTML docs; if you want pure source downloads with no docs, those are available on github:

Tarball: https://github.com/ipython/ipython/tarball/rel-0.11
Zipball: https://github.com/ipython/ipython/zipball/rel-0.11

* Features *

Here is a quick listing of the major new features:

- Standalone Qt console
- High-level parallel computing with ZeroMQ
- New model for GUI/plotting support in the terminal
- A two-process architecture
- Fully refactored internal project structure
- Vim integration
- Integration into Microsoft Visual Studio
- Improved unicode support
- Python 3 support
- New profile model
- SQLite storage for history
- New configuration system
- Pasting of code with prompts

And many more... We closed over 500 tickets, merged over 200 pull requests, and more than 60 people contributed over 2200 commits for the final release.
Please see our release notes for the full details on everything about this release: https://github.com/ipython/ipython/zipball/rel-0.11

* Resources *

You can see a talk about this release that was presented at the SciPy 2011 conference:

http://www.archive.org/details/Wednesday-203-6-IpythonANewArchitectureForInteractiveAndParallel

For reference, the slides that go along with it are here:

http://fperez.org/talks/1107_ipython_scipy.pdf

And there's an excellent blog post, written by Chris Fonnesbeck, providing a visual tour of our new features:

http://stronginference.com/weblog/2011/7/15/innovations-in-ipython.html

As usual, if you find any problem, please file a ticket --or even better, a pull request fixing it-- on our github issues site (https://github.com/ipython/ipython/issues/).

Many thanks to all who contributed!

Fernando, on behalf of the IPython development team.
http://ipython.org
[issue10144] Buffering bug after calling curses function
Fernando Perez fdo.pe...@gmail.com added the comment:

No problem for us (IPython) if you mark it as won't fix. I've just applied the environment workaround you guys suggested:

http://github.com/ipython/ipython/commit/147b245d2ead0e15d2c17b7bb760a03126660fb7

Thanks a lot for that tip! That leaves us in good shape.

-- nosy: +fperez

___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue10144> ___
[issue1170] shlex have problems with parsing unicode
Fernando Perez fdo.pe...@gmail.com added the comment:

Yes, sorry that I failed to mention the example I gave applies only to 2.x, not to 3.x.

-- ___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue1170> ___
[issue1170] shlex have problems with parsing unicode
Fernando Perez fdo.pe...@gmail.com added the comment:

On Tue, Jul 27, 2010 at 11:52, Alexander Belopolsky rep...@bugs.python.org wrote:

> Why do you expect shlex to work with unicode in 2.x? The documentation clearly says that the argument should be a string. Supporting unicode is not an unreasonable RFE, but won't be considered for 2.x anymore.

Well, I didn't make the original report, just provided a short, illustrative example :) It's easy enough to work around the issue for 2.x that I don't care too much about it, so I have no problem with 2.x staying as it is.

What's your take on accepting bytes in 3.x? Mmh... not too sure. I'd think about it from the perspective of what possible sources of input could produce raw bytes that would be reasonable use cases for shlex. Is it common in 3.x to read a file in bytes mode? If so, then it might be a good reason to have shlex parse bytes as well, since I can imagine reading inputs from files to be parsed via shlex. But take my opinion on 3.x with a big grain of salt; I have very little experience with it as of yet.

Cheers, f

-- ___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue1170> ___
[issue1170] shlex have problems with parsing unicode
Fernando Perez fdo.pe...@gmail.com added the comment:

Here is an illustration of the problem with a simple test case (the value of the posix flag doesn't make any difference):

    Python 2.6.5 (r265:79063, Apr 16 2010, 13:09:56)
    [GCC 4.4.3] on linux2
    Type "help", "copyright", "credits" or "license" for more information.
    >>> import shlex
    >>> list(shlex.shlex('ab'))
    ['ab']
    >>> list(shlex.shlex(u'ab', posix=True))
    ['a', '\x00', '\x00', '\x00', 'b', '\x00', '\x00', '\x00']
    >>> list(shlex.shlex(u'ab', posix=False))
    ['a', '\x00', '\x00', '\x00', 'b', '\x00', '\x00', '\x00']

-- nosy: +fperez

___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue1170> ___
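For later readers: in Python 3, shlex operates on str (i.e. unicode) directly, so the null-byte artifact shown above no longer occurs. A quick sanity check of the same calls on Python 3:

```python
import shlex

# The same tokenizations as in the 2.x session above, but on Python 3,
# where all strings are unicode and shlex handles them natively.
print(list(shlex.shlex('ab')))              # -> ['ab']
print(list(shlex.shlex('ab', posix=True)))  # -> ['ab']
print(shlex.split("a 'b c'"))               # -> ['a', 'b c']
```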
[issue7897] Support parametrized tests in unittest
Fernando Perez fdo.pe...@gmail.com added the comment:

Hey Yarick,

On Thu, Apr 8, 2010 at 18:53, Yaroslav Halchenko rep...@bugs.python.org wrote:

> In PyMVPA we have our little decorator as an alternative to Fernando's generators, and which is closer, I think, to what Michael was wishing for: @sweepargs
> http://github.com/yarikoptic/PyMVPA/blob/master/mvpa/testing/sweepargs.py
> NB it has some minor PyMVPA specificity which could be easily wiped out, and since it was at most 4 eyes looking at it and it bears evolutionary changes, it is far from being the cleanest/best piece of code, BUT:
> * it is very easy to use: just decorate a test method/function and give an argument which to vary within the function call, e.g. something like
>
>     @sweepargs(arg=range(5))
>     def test_sweepargs_demo(arg):
>         ok_(arg < 5)
>         ok_(arg < 3)
>         ok_(arg < 2)
>
> For nose/unittest it would still look like a single test

Thanks for the post; I obviously defer to Michael on the final decision, but I *really* would like a solution that reports an 'argument sweep' as multiple tests, not as one. They are truly multiple tests (since they can pass/fail independently), so I think they should be treated as such. On the other hand, your code does have nifty features that could be used as well, so perhaps the best of both can be used in the end.

Cheers, f

-- ___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue7897> ___
[issue7897] Support parametrized tests in unittest
Fernando Perez fdo.pe...@gmail.com added the comment:

Yarick: yes, I do actually see the value of the summary view. When I have a parametric test that fails, I tend to just run nose with -x so it stops at the first error, and with the --pdb option to study it, so I simply ignore all the other failures. To me, test failures are quite often like compiler error messages: if there's a lot of them, it's best to look only at the first few, fix those and try again, because the rest could be coming from the same cause.

I don't know if Michael has plans/bandwidth to add the summary support as well, but I agree it would be very nice to have, just not at the expense of individual reporting.

-- ___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue7897> ___
saving a TIFF
I'm trying to save an image created from two arrays (I'm using numpy and PIL):

    n = (b4a - b3a) / (b4a + b3a)
    ndvi = Image.fromarray(n)
    ndvi.save("F:\Fire_scar_mapping\ILS3\ndvi_test", "TIFF")

...but I get the following error message:

    IOError: [Errno 22] invalid mode ('wb') or filename: 'F:\\Fire_scar_mapping\\ILS3\ndt'

Could anybody tell me what I'm doing wrong?
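One likely culprit (my reading of the traceback, not an answer from the original thread): the path in the error message collapses around "\nd...", because '\n' in a plain string literal is a newline, not a backslash followed by 'n'. A minimal check of that hypothesis, using the same path as the post:

```python
# In a normal string literal '\n' becomes a newline; a raw string keeps
# the backslash. The doubled '\\' below mirror the repr in the traceback.
bad = 'F:\\Fire_scar_mapping\\ILS3\ndvi_test'   # '\n' turns into a newline
good = r'F:\Fire_scar_mapping\ILS3\ndvi_test'   # raw string: backslash kept

print('\n' in bad)    # -> True: the filename silently contains a newline
print('\n' in good)   # -> False
```

If that is the bug, passing a raw string (or forward slashes) to ndvi.save, ideally with an explicit extension such as .tif, should make the IOError go away.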
[issue7897] Support parametrized tests in unittest
Fernando Perez fdo.pe...@gmail.com added the comment:

I should probably have clarified better our reasons for using this type of code. The first is the one Michael pointed out, where such parametric tests all execute; it's very common in scientific computing to have algorithms that only fail for certain values, so it's important to identify these points of failure easily while still running the entire test suite.

The second is that the approach nose uses produces, on failure, the nose stack, not the stack of the test. Nose consumes the test generators at test discovery time, and then simply calls the stored assertions at test execution time. If a test fails, you see a nose traceback which is effectively useless for debugging, and with which using --pdb for interactive debugging doesn't help much (all you can do is print the values, as your own stack is gone). This code, in contrast, evaluates the full test at execution time, so a failure can be inspected 'live'. In practice this makes an enormous difference in a test suite being actively useful for ongoing development, where changes may send you into debugging often.

I hope this helps clarify the intent of the code better; I'd be happy to provide further details.

-- ___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue7897> ___
[issue7897] Support parametrized tests in unittest
New submission from Fernando Perez fdo.pe...@gmail.com:

IPython has unittest-based parametric testing (something nose has, but which produces effectively undebuggable tests, while this approach gives perfectly debuggable ones). The code lives here for 2.x and 3.x:

http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/IPython/testing/_paramtestpy2.py
http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/IPython/testing/_paramtestpy3.py

We import them into our public decorators module for actual use:

http://bazaar.launchpad.net/~ipython-dev/ipython/trunk/annotate/head%3A/IPython/testing/decorators.py

Simple tests showing them in action are here:

http://bazaar.launchpad.net/%7Eipython-dev/ipython/trunk/annotate/head%3A/IPython/testing/tests/test_decorators.py#L45

The code is all BSD and we'd be more than happy to see it used upstream; the less we have to carry the better. If there is interest in this code, I'm happy to sign a PSF contributor agreement; the code is mostly my authorship. I received help for the 3.x version on the Testing in Python mailing list, and I would handle asking for permission on-list if there is interest in including this.

-- messages: 99149 nosy: fperez severity: normal status: open title: Support parametrized tests in unittest type: feature request versions: Python 2.7, Python 3.2

___ Python tracker rep...@bugs.python.org <http://bugs.python.org/issue7897> ___
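For readers arriving later: unittest eventually grew native support in this direction. subTest (available since Python 3.4) reports each parameter set individually while keeping the failure's own stack alive, which is the debuggability property argued for in this issue. A minimal sketch of the pattern:

```python
import unittest

class TestParametric(unittest.TestCase):
    def test_sweep(self):
        # Each arg is reported as its own sub-test; a failure for one
        # value is attributed to that value and doesn't hide the others.
        for arg in range(5):
            with self.subTest(arg=arg):
                self.assertLess(arg, 5)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestParametric)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())   # -> True
```

Note that unittest still counts the loop as one test method (testsRun == 1); only failures are broken out per sub-test.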
Trouble with subprocess.Popen()
Hi, I'm having a very strange issue with subprocess.Popen. I'm using it to call an external exe several times and keep the output in a list. Every time you call this external exe, it will return a different string. However, if I call it several times using Popen, it will always return the SAME string. =:-O It looks like Popen is always returning the same value from stdout, without re-running the exe. This is my code:

    def get_key():
        from subprocess import Popen, PIPE
        args = [C_KEY_MAKER, '/26', USER_NAME, ENCRYPTION_TEMPLATE, '0', ]
        process = Popen(args, stdout=PIPE)
        output = process.communicate()[0].strip()
        return output

    print get_key() # Returns a certain string
    print get_key() # Should return another string, but returns the same!

What on Earth am I doing wrong?
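For what it's worth, Popen itself does launch a fresh child process on every call. A minimal sketch of the same pattern (using the Python interpreter as a stand-in for the external exe, since C_KEY_MAKER isn't available here) shows two calls producing different output:

```python
import subprocess
import sys

def get_pid_from_child():
    # Same shape as get_key() above, but the child prints its own PID,
    # which necessarily differs between two freshly spawned processes.
    process = subprocess.Popen(
        [sys.executable, '-c', 'import os; print(os.getpid())'],
        stdout=subprocess.PIPE,
    )
    return process.communicate()[0].strip()

first = get_pid_from_child()
second = get_pid_from_child()
print(first != second)   # -> True: each call really re-ran the child
```

So if the real exe keeps repeating its output, the caching is more likely inside the exe itself (or its inputs) than in Popen.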
logging and daemons
Hello. I'm in the process of replacing a custom logger class in one of my apps that has several daemons. In the last step of daemonizing a program, after closing fds, stderr and stdout are redirected to the logfile of the program. Now I'm trying to use TimedRotatingFileHandler as the only channel when the programs run in daemon mode.

My problem is: I can't see a good way to redirect both stderr and stdout to the logger. Note that I don't have any print statements in any of my code, but I can't be sure about all the modules I'm importing, and I'd like any uncaught exception info that may go to stderr/stdout to show up in the logfiles. Any ideas?

Thanks a lot, -- Fernando
Re: logging and daemons
Hello, thanks for the answer.

On Mon, Feb 16, 2009 at 05:07:45AM -0800, Garrett Cooper wrote:

> You can actually set sys.std[err|out] to your 'file' descriptor of choice in python (it has to have read, write, and flush methods, IIRC, to function). The only thing is (like all things dealing with multiple file descriptors like std[err|out]) the output may come in out of order due to flushing and insertion into the buffers, but it shouldn't be as much of an issue considering that the file descriptor for both items is the same descriptor; this is just a note of forewarning.

Yes, but I'm trying to use *TimedRotating*FileHandler, which makes the fd of the logfile change on every rotation of the logfile. So the direct approach of std[out|err] redirection to the logfile fd obtained from the logger instance is unusable (it works fine with a simple file handler).

I'm looking into this because I really need rotating (when debugging is on, large amounts of data are logged) and because I like the logging module approach in every aspect. Also, might it be possible to derive the class and add a file-like write method? Anyone using logging in this manner?

Cheers, -- Fernando
Re: logging and daemons
On Mon, Feb 16, 2009 at 10:11:51AM -0800, Scott David Daniels wrote:

> Fernando M. Maresca wrote:
> > On Mon, Feb 16, 2009 at 05:07:45AM -0800, Garrett Cooper wrote:
> > > You can actually set sys.std[err|out] to your 'file' descriptor of choice in python
> > Yes, but I'm trying to use *TimedRotating*FileHandler, which makes the fd of the logfile change on every rotation of the logfile. So the direct approach of std[out|err] redirection to the logfile fd obtained from the logger instance is unusable (it works fine with a simple file handler).
>
> Right, so you stick something in as stderr/stdout that talks to the logfile. That is, something like:
>
>     import sys
>     class MyFakeStdout(object):
>         def flush(self):
>             pass
>         def write(self, text):
>             # code to actually do the logging
>     sys.stderr = sys.stdout = MyFakeStdout()

Thanks a lot for the input. That's pretty much what I'm doing right now; I just wondered if there were a cleaner way.

Cheers, -- Fernando
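Filling in the placeholder from the sketch quoted above: a runnable version (the class name LoggerWriter is my own, not from the thread) that forwards writes to a Logger. Because the shim holds a logger rather than a file descriptor, rotation by TimedRotatingFileHandler is invisible to the redirected streams:

```python
import logging
import sys

class LoggerWriter(object):
    # File-like shim: each write() becomes a log record, so whichever
    # file the handler currently has open receives it, even after the
    # handler has rotated the logfile.
    def __init__(self, logger, level=logging.INFO):
        self.logger = logger
        self.level = level

    def write(self, text):
        text = text.rstrip()
        if text:                      # skip the bare newlines print() emits
            self.logger.log(self.level, text)

    def flush(self):                  # required by some callers
        pass

# In the daemon, after configuring the rotating handler, one would do:
#     sys.stdout = LoggerWriter(logging.getLogger('daemon'))
#     sys.stderr = LoggerWriter(logging.getLogger('daemon'), logging.ERROR)
```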
[issue1168055] Add current dir when running try_run test program
Fernando Gomes fgs...@gmail.com added the comment:

Dear Community,

I've run into problems installing the Statistics module on my Ubuntu. There are errors that are hard to solve without expert help, such as the one received from Michiel (thank you again!!!). The errors are listed below:

    # XXX
    o...@fgsj-ima:/usr/lib/python2.5/statistics-0.19# python setup.py config
    running config
    = begin statistics configuration
    unable to execute _configtest: No such file or directory
    libm does not contain erf; skipping this feature
    unable to execute _configtest: No such file or directory
    libm does not contain erfc; skipping this feature
    = end statistics configuration
    == wrote config file (src/config.h)
    r...@fgsj-ima:/usr/lib/python2.5/statistics-0.19# python setup.py build
    running build
    running build_ext
    building 'statistics' extension
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -Isrc -I/usr/lib/python2.5/site-packages/numpy/core/include -I/usr/include/python2.5 -c src/statisticsmodule.c -o build/temp.linux-i686-2.5/src/statisticsmodule.o
    src/statisticsmodule.c:1:20: error: Python.h: No such file or directory
    In file included from /usr/lib/python2.5/site-packages/numpy/core/include/numpy/arrayobject.h:14, from src/statisticsmodule.c:2:
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:100:2: error: #error Must use Python with unicode enabled.
    In file included from /usr/lib/python2.5/site-packages/numpy/core/include/numpy/arrayobject.h:14, from src/statisticsmodule.c:2:
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:920: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:921: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘npy_uintp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1013: error: expected ‘=’, ‘,’, ‘;’, ‘asm’ or ‘__attribute__’ before ‘*’ token
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1014: error: expected ‘)’ before ‘*’ token
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1016: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1016: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1017: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1027: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1027: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1029: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1029: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1030: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1032: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1037: error: expected ‘)’ before ‘*’ token
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1042: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1044: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1045: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1045: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1047: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1051: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1053: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1054: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr/lib/python2.5/site-packages/numpy/core/include/numpy/ndarrayobject.h:1055: error: expected declaration specifiers or ‘...’ before ‘npy_intp’
    /usr
Re: RELEASED Python 3.0 final
On Dec 4, 5:45 pm, Andreas Waldenburger [EMAIL PROTECTED] wrote:

> On Thu, 4 Dec 2008 11:52:38 -0600 [EMAIL PROTECTED] wrote: As you have probably guessed: nothing changed here. Also see: http://www.python.org/dev/peps/pep-0666/ What? Do you mean it's possible to mix tabs and spaces still? Why? Daniel Why not? Because it has historically been a source of errors in a mixed development environment (people using text editors with different tab stops). Better to not allow them to be mixed.
> Whenever has it been a pythonic ideal to not allow stuff? You get warnings. Everything else is up to you. /W
> -- My real email address is constructed by swapping the domain with the recipient (local part).

Python has not allowed stuff for a long time. For example, it disallows statements in lambdas. Disallowing is not bad. Disallowing bad practices (like mixing tabs and spaces) is actually good!

I agree that the tab/space thing should be changed. Would it be too hard to make the parser check whether the indentation is consistent in the whole file? This is an annoying source of problems, especially since you can't tell a space from a tab just by looking at it.

And I personally disliked most of the changes (especially the ones on map and reduce). I hope functional programming doesn't get even more hindered in future releases, because I believe these changes only made Python weaker.

Well, anyway, congratulations to everyone on the Python 3 release. Some of the changes were a real improvement (like the Unicode sources). And I hope that, in the end, these changes help make Python a better language.
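For the record, Python 3 did tighten this up: indentation that mixes tabs and spaces inconsistently is rejected at compile time with a TabError rather than a warning, which can be checked directly:

```python
# The first indented line uses a tab, the second uses spaces; whether
# they are "the same" level depends on the tab width, so Python 3
# refuses to guess and raises TabError at compile time.
source = "if True:\n\tx = 1\n        y = 2\n"
try:
    compile(source, '<mixed>', 'exec')
except TabError as exc:
    print('TabError:', exc)
```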
Re: Pyhon (with wxPython) on Windows' cygwin: can it be done fully ?
At first I also disliked print's new syntax, but later I realised it could be useful. However, I agree that the parentheses are annoying. Not because of the parens themselves, but because of the Shift key. Why can't programmers have special keyboards with parens keys that don't need pressing Shift? Isn't it time C programmers have a key and perl programmers a $ one? And why the heck do we need Shift for ( and not for [ or {, since the first one is much more used (even outside programming)?

Really, we don't need to change our syntax, we need to change our keyboards. We are so blinded by tradition that we are losing productivity. -- http://mail.python.org/mailman/listinfo/python-list
[issue4287] Broken URL in documentation style guide
New submission from Fernando Correia [EMAIL PROTECTED]: The documentation Style Guide [http://docs.python.org/dev/documenting/style.html] has a broken link to http://developer.apple.com/documentation/UserExperience/Conceptual/APStyleGuide/AppleStyleGuide2006.pdf. This link should be updated to: http://developer.apple.com/documentation/UserExperience/Conceptual/APStyleGuide/APSG_2008.pdf -- assignee: georg.brandl components: Documentation messages: 75649 nosy: facorreia, georg.brandl severity: normal status: open title: Broken URL in documentation style guide type: feature request versions: Python 2.7 ___ Python tracker [EMAIL PROTECTED] http://bugs.python.org/issue4287 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
Re: from package import * without overwriting similarly named functions?
Also, remember that since later imports will always overwrite earlier ones, you can just reverse the order of the imports:

from package2 import *
from package1 import *

This should preserve the functions of package1 over the other ones. -- http://mail.python.org/mailman/listinfo/python-list
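The ordering trick above can be demonstrated without creating real packages on disk; the sketch below uses two throwaway in-memory modules as stand-ins (package1, package2, and greet are hypothetical names, not from the original thread):

```python
import sys
import types

# Build two fake modules that both define a function named greet().
pkg1 = types.ModuleType("package1")
exec("def greet():\n    return 'package1'", pkg1.__dict__)
pkg2 = types.ModuleType("package2")
exec("def greet():\n    return 'package2'", pkg2.__dict__)
sys.modules["package1"] = pkg1
sys.modules["package2"] = pkg2

# Star-import package2 first, then package1: the later import wins.
ns = {}
exec("from package2 import *\nfrom package1 import *", ns)
survivor = ns["greet"]()

# Tidy up the fake entries.
del sys.modules["package1"], sys.modules["package2"]
```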
[issue3158] Doctest fails to find doctests in extension modules
Fernando Pérez [EMAIL PROTECTED] added the comment: I think there are two issues that need to be separated:

1. The doctest bug. I'm happy with any resolution for it, and I'm not claiming that my patch is the best approach. isroutine() indeed works in my case, and if that approach works well in general for doctest, I'm perfectly happy with it.

2. Terminology. I really disagree with the idea that:

- 'function' describes the implementation language of an object instead of whether it's a standalone callable (vs an object method).
- 'builtin' doesn't mean the object is built into the shipped Python but instead that it's written in C.

The language exposes its builtins via the __builtin__ module precisely to declare what is part of itself, and the documentation (http://docs.python.org/lib/built-in-funcs.html) even has a section that starts:

"2.1 Built-in Functions -- The Python interpreter has a number of functions built into it that are always available."

Nowhere does it say that builtins are written in C and functions in Python. In summary, I'm happy with any fix for the bug, but I very strongly disagree with a use of terminology that is confusing and misleading (and which unfortunately is enshrined in the inspect and types modules in how they distinguish 'Function' from 'BuiltinFunctionType').

And by the way, by 'extension module' I mean to describe C extensions, since that is how most C code is shipped by third-party authors, those affected by this bug (since the stdlib doesn't seem to use doctests itself for its own testing of C code). ___ Python tracker [EMAIL PROTECTED] http://bugs.python.org/issue3158 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
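The terminology quirk being discussed can be shown directly with the inspect module (here len stands in for any C-implemented function):

```python
import inspect

def py_func():
    """A function written in Python."""

checks = (
    inspect.isfunction(py_func),  # True: a Python-level function
    inspect.isfunction(len),      # False, even though len *is* a function
    inspect.isbuiltin(len),       # True, meaning "written in C", as criticized above
    inspect.isroutine(py_func),   # isroutine() covers both kinds...
    inspect.isroutine(len),       # ...which is why it suits doctest's needs
)
```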
[issue3158] Doctest fails to find doctests in extension modules
New submission from Fernando Pérez [EMAIL PROTECTED]: Doctest fails to find doctests defined in extension modules. With tools like cython (http://cython.org) it's trivially easy to add docstrings to extension code, a task that is much less pleasant with hand-written extensions. The following patch is a minimal fix for the problem:

--- doctest_ori.py	2008-06-20 19:22:56.0 -0700
+++ doctest.py	2008-06-20 19:23:53.0 -0700
@@ -887,7 +887,8 @@
         for valname, val in obj.__dict__.items():
             valname = '%s.%s' % (name, valname)
             # Recurse to functions & classes.
-            if ((inspect.isfunction(val) or inspect.isclass(val)) and
+            if ((inspect.isfunction(val) or inspect.isclass(val) or
+                 inspect.isbuiltin(val)) and
                 self._from_module(module, val)):
                 self._find(tests, val, valname, module, source_lines, globs, seen)

However, it is likely not sufficient as it doesn't take into account the __test__ dict, for which probably the same change would work, just a few lines later.

Furthermore, the real issue is in my view in the distinction made by inspect between isfunction() and isbuiltin() for the sake of analyzing docstrings. isfunction() returns False for a function that is defined in an extension module (though it *is* a function) while isbuiltin() returns True (though it is *not* a builtin). For purposes of doctesting, doctest should simply care:

- That it is a function.
- That it has a docstring that can be parsed.

But in too many places in doctest there are currently assumptions about being able to extract full source, line numbers, etc. Hopefully this quick fix can be applied as it will immediately make doctest work with large swaths of extension code, while a proper rethinking of doctest is made. BTW, in that process doctest will hopefully be made more modular and flexible: its current structure forces massive copy/paste subclassing for any kind of alternate use, since it has internally hardwired use of its own classes.

Doctest is tremendously useful, but it really could do with some structural reorganization to make it more flexible (cleanly). -- components: Library (Lib) messages: 68489 nosy: fer_perez severity: normal status: open title: Doctest fails to find doctests in extension modules versions: Python 2.5 ___ Python tracker [EMAIL PROTECTED] http://bugs.python.org/issue3158 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
Re: Avoiding redirects with urllib
Hello [EMAIL PROTECTED],

> import urllib
>
> url_opener = urllib.URLopener()  # create URLopener
> # You could also work with urllib.FancyURLopener
> try:
>     # open index.html
>     data = url_opener.open("http://www.somedomain.com/index.html")
> except IOError, error_code:
>     if error_code[0] == "http error":
>         if error_code[1] == 301:
>             pass  # do something here
>         elif error_code[1] == 302:
>             pass  # do something here
>
> I hope that's of some help! I think you may want to delve deeper into FancyURLopener...

The problem is that I'm using a subclass of FancyOpener and it doesn't raise those exceptions. -- http://mail.python.org/mailman/listinfo/python-list
Re: Avoiding redirects with urllib
Hello Fernando,

>> I hope that's of some help! I think you may want to delve deeper into FancyURLopener...
> The problem is that I'm using a subclass of FancyOpener and it doesn't raise those exceptions.

Ok, forget it, I should have read the fine manual. O:-) -- http://mail.python.org/mailman/listinfo/python-list
Avoiding redirects with urllib
Hi, I'm using urllib to download pages from a site. How can I detect if a given url is being redirected somewhere else? I want to avoid this; is it possible? Thanks in advance! -- http://mail.python.org/mailman/listinfo/python-list
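For readers landing here today: in modern Python 3 the old urllib/FancyURLopener API is gone, but urllib.request makes this straightforward. A redirect handler whose redirect_request() returns None makes the opener raise HTTPError on 3xx responses instead of silently following them. A minimal sketch (fetch_without_redirects is a made-up helper name, and it is not called here since it needs network access):

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Returning None refuses the redirect; open() then raises HTTPError.
        return None

def fetch_without_redirects(url):
    opener = urllib.request.build_opener(NoRedirect())
    try:
        return opener.open(url).read()
    except urllib.error.HTTPError as err:
        if err.code in (301, 302, 303, 307, 308):
            raise RuntimeError("redirect to %s refused" % err.headers.get("Location"))
        raise
```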
[issue2657] Curses sometimes fails to initialize terminal
Fernando Pérez [EMAIL PROTECTED] added the comment: As reported by Ondrej Certik on the IPython mailing list:

> Here is how to reliably (100%) reproduce it in ipython 0.8.2 (the bug indeed goes away in 0.8.4): http://code.google.com/p/sympy/issues/detail?id=822 together with a screenshot of how the terminal looks (see comment #6 for the exact sympy revision to use). Maybe you could use it to track the bug down in curses, as your patch only seems to be a workaround (albeit a working one). Ondrej

While unfortunately right now I don't have the time to try and whittle this down to a smaller, self-contained example, it's great to at least have a guaranteed reproducible way of triggering the bug. It requires installing specific versions of both ipython and sympy, but that's very straightforward to do, as both are pure-python projects with no dependencies outside the stdlib. ___ Python tracker [EMAIL PROTECTED] http://bugs.python.org/issue2657 ___ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
Error when calling superclass __init__ method
Hi, I'm getting an odd error while trying to call the __init__ method of a super class:

BaseField.__init__(self)
TypeError: unbound method __init__() must be called with BaseField instance as first argument (got nothing instead)

This is the code:

class BaseField(object):
    def _addFieldsToRec(self, rec, *fields):
        for field in fields:
            self.mfn[field] = rec

    def __init__(self):
        self._addFieldsToRec(1, 1)
        self._addFieldsToRec(2, 500, 501, 502, 503, 504, 505, 506, 507, 508)

class Field(BaseField):
    def __init__(self, value):
        BaseField.__init__(self)  # this seems to be the offending line.
        self.tag = value

What am I doing wrong? -- http://mail.python.org/mailman/listinfo/python-list
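For the record, the call syntax in the post is fine: in a fresh interpreter the same pattern runs, provided self.mfn is initialized somewhere (the post never shows where it comes from, so the dict below is an assumption, and the field list is abbreviated). In Python 2, that particular TypeError often meant the class object had gone stale, e.g. after a module reload, rather than the call being wrong.

```python
class BaseField(object):
    def __init__(self):
        self.mfn = {}                      # assumed: mfn must exist before use
        self._addFieldsToRec(1, 1)
        self._addFieldsToRec(2, 500, 501)  # field list abbreviated

    def _addFieldsToRec(self, rec, *fields):
        for field in fields:
            self.mfn[field] = rec

class Field(BaseField):
    def __init__(self, value):
        BaseField.__init__(self)           # the reportedly offending call
        self.tag = value

f = Field(42)
```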
[issue2657] curses
New submission from Fernando Pérez [EMAIL PROTECTED]: Curses sometimes fails to correctly initialize the terminal. Unfortunately I don't know how to reproduce the problem; it was reported multiple times by ipython users, but I have no idea what causes it. I finally found a workaround by making a termios call that at least restores terminal state (see attachment), but it's just a workaround, not a real fix.

The issue manifests itself as follows: at some point (I don't know why), a call to curses.initscr() doesn't actually set the terminal in the usual mode where input isn't accepted, but instead the terminal continues accepting normal input, issuing newlines, etc. The only sign that curses is active is that in a modern terminal, the scrollbar changes to fill the screen. After this, calling curses.endwin(), instead of restoring terminal state, leaves the terminal in the mode that typically initscr() would put it in: no input is displayed, printouts swallow end-of-line characters, etc.

When this happened in ipython sessions, we'd just suggest users call !reset (the system command), which would restore terminal state. But the problem is obviously in curses itself, because once this problem appeared, running the attached script would always print 'False' for the state condition checked there. For now in IPython we have a workaround, but perhaps with this little description/example, someone who knows the curses code might be able to actually fix the real problem. If I find a reliable way to trigger the bug, I'll add comments here indicating so.

-- components: Extension Modules files: cursesbug.py messages: 65626 nosy: fer_perez severity: normal status: open title: curses type: behavior versions: Python 2.5 Added file: http://bugs.python.org/file10059/cursesbug.py __ Tracker [EMAIL PROTECTED] http://bugs.python.org/issue2657 __ ___ Python-bugs-list mailing list Unsubscribe: http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com
Re: Strange behavior of __get__ in a descriptor in IPython
Jakub Hegenbart wrote:
> Hi, I'm studying the descriptor protocol and its usage from the following document: http://users.rcn.com/python/download/Descriptor.htm
> There is some sample code: http://users.rcn.com/python/download/Descriptor.htm#descriptor-example that behaves in a different way on my machine than the example suggests:
>
> In [2]: a=MyClass()
> In [3]: a.x
> Retrieving var x
> Retrieving var x
> Out[3]: 1
>
> On the other hand, in the 'plain' Python shell, it's invoked only once as expected:
>
> >>> a=desc.MyClass()
> >>> a.x
> Retrieving var x
> 10
>
> Should I take as granted that IPython might in some cases access an attribute of an object more than once even in face of side effects, or is this a bug?

Yup, IPython does access attributes more than once in an attempt to determine if things can be called as functions. This behavior, however, only exists if 'autocall' is active. Here's an example:

In [1]: run desc
In [2]: m.x
Retrieving var x
Retrieving var x
Out[2]: 10
In [3]: m.x
Retrieving var x
Retrieving var x
Out[3]: 10
In [4]: autocall 0
Automatic calling is: OFF
In [5]: m.x
Retrieving var x
Out[5]: 10

As you can see, once autocall is disabled, the double access goes away. There really is no way to provide the autocall feature without any side effects whatsoever, so if you need to avoid them at all costs, disable this feature. You can do it permanently by editing your ipythonrc file. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
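A self-contained variant of the descriptor under discussion, with an access counter so the "how many times was __get__ called?" question can be answered directly (LoggedAttr is a made-up name, not from the referenced document):

```python
class LoggedAttr(object):
    def __init__(self, value):
        self.value = value
        self.accesses = 0

    def __get__(self, obj, objtype=None):
        # Called on every attribute lookup that reaches the descriptor.
        self.accesses += 1
        return self.value

class MyClass(object):
    x = LoggedAttr(10)

m = MyClass()
first = m.x                              # one plain access: one __get__ call
count = MyClass.__dict__['x'].accesses   # raw dict access bypasses __get__
```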
Confused about closures and scoping rules
Hi all, consider the following small example:

"""Small test to try to understand a strange subtlety with closures."""

def outer(nmax):
    aa = []
    for n in range(nmax):
        def a(y):
            return (y, n)
        print 'Closure and cell id:', id(a.func_closure), \
              id(a.func_closure[0])
        aa.append(a)
    return aa

print 'Closure creation.'
nmax = 3
aa = outer(nmax)
print
print 'Closure use.'
for n in range(nmax):
    print '%s:%s' % (n, aa[n]('hello'))
## EOF #

If I run this, I get:

planck[test]> python debug_closures.py
Closure creation.
Closure and cell id: 1075998828 1075618940
Closure and cell id: 1075999052 1075618940
Closure and cell id: 1075999084 1075618940

Closure use.
0:('hello', 2)
1:('hello', 2)
2:('hello', 2)

My confusion arises from the printout after 'closure use'. I was expecting that each new function 'a' created inside the loop in 'outer' would capture the value of n, therefore my expectation was to see a printout like: 0:('hello', 0) 1:('hello', 1)... etc.

However, what happens is a bit different. As can be seen from the printouts of 'Closure and cell id', in each pass of the loop a new closure is created, but it reuses the *same* cell object every time. For this reason, all the closures end up sharing the scope with the values determined by the *last* iteration of the loop.

This struck me as counterintuitive, but I couldn't find anything in the official docs indicating what the expected behavior should be. Any feedback/enlightenment would be welcome. This problem appeared deep inside a complicated code and it took me almost two days to track down what was going on... Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Confused about closures and scoping rules
Diez B. Roggisch wrote:
> It's a FAQ. The reason is that the created closures don't capture the _value_, but the _name_. Plus of course the locals()-dictionary outside the function a to perform the lookup of that name. Which has the value bound to it in the last iteration. Common cure for this is to create an a-local name that shadows the outer variable and is simultaneously bound to the desired value:

Many thanks (also to JP) for the clear explanation. Greatly appreciated. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
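The "common cure" mentioned above can be sketched compactly in modern Python: binding the loop variable as a default argument makes each function capture the value current at definition time, instead of the shared name (outer and outer_buggy are illustrative names):

```python
def outer_buggy(nmax):
    # All lambdas share the single cell holding n: they all see its final value.
    return [lambda y: (y, n) for n in range(nmax)]

def outer(nmax):
    # n=n binds the current value as a default argument at definition time.
    return [lambda y, n=n: (y, n) for n in range(nmax)]

shared = [f('hello')[1] for f in outer_buggy(3)]
captured = [f('hello')[1] for f in outer(3)]
```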
Re: Yet another comparison of Python Web Frameworks
Massimo Di Pierro wrote:
> P.S. Michele Simionato, I have heard your name before. Is it possible we have met in Pisa in 1990-1996? I am also a Quantum Field Theorist and there are not many of us.

More than you think, it seems. Some of us were even using python to process Lattice QCD computations years ago ;-) Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Yet another comparison of Python Web Frameworks
Massimo Di Pierro wrote:
> Happy to hear that. You may want to take a look at http://mdp.cti.depaul.edu/vqcd; it is mostly python stuff and I will post the code soon.

Ah, memories :) I'm not working on QCD anymore, but I did write a bunch of code a while back to script Mayavi (the old one, not the new Mayavi2) to auto-generate webpages slicing through topological charge configurations, looking for instanton molecules. Your images are a bit small for me to tell what toolkit you're using (Mayavi is VTK based), but if you want, I'd be happy to send you that old code.

It probably won't do anything you don't already have, but it was handy. It would slice in any direction (typically t) and generate one PNG per value, with the remaining 3 variables displayed, and isosurfaces computed at 1/4 and 3/4s of the total distribution of values, to conveniently see the instanton/anti-instanton pairs. By spitting out one webpage per field configuration, I could let it run overnight and then the next day quickly have an overview of all the configurations I had (or show them to my advisor).

The code was written to read the MILC v5 binary lattice formats, which is what I was using. Just drop me a line if you happen to want any of that code. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Any syntactic cleanup likely for Py3? And what about doc standards?
Ferenczi Viktor wrote:
> Properties are very useful, since ordinary attribute access can be transparently replaced with properties if the developer needs to add code when it's set or needs to calculate its value whenever it is read. As an additional benefit this could allow developers to implement change events for properties that allow any number of observers to monitor property changes. This could be very useful in GUI programming, for example.

You might be interested in Traits: http://code.enthought.com/traits/ Beautiful technology, precisely for the use cases you list, amongst many others. cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Registering a python function in C
> Is Maya a different python build than what is contained at python.org? If so, I suggest you get your C program to work with the latest python build from python.org. Then see if you can get it to work with the Maya version.

Ok, did that. If I write a normal C++ program and use the python installed in my system, everything works ok and I can call the python functions. From within Maya (where the C code is running as a plugin), nothing happens. I tried removing my python installation so that only the one that comes with maya is running, but then I have no Python.h or libs to compile against!! I found no help at the maya/python newsgroup; is there anyone who has done this before??? Thanks for all the help! -- http://mail.python.org/mailman/listinfo/python-list
Re: Registering a python function in C
Thanks for the responses. To be more specific, this code is part of a Maya plugin. The function MFnPlugin::registerUI takes a pointer to a PyObject which is the function that will set up the UI for that plugin. The code Matimus posted seems to me exactly like what I need to do, except that maya gives me an error when I call PyImport_ImportModule... I don't even have a chance to check the return value, Maya simply gives up. I have checked that python is initialized by the time I call this function, and the python path is correct; I can load the module from the maya python interpreter. What bugs me is that PyImport_ImportModule doesn't even return; it should return 0 if something bad happened, right? Here's my code:

if (Py_IsInitialized())
    cout << "python is already initialized" << endl;
if (!Py_IsInitialized()) {
    cout << "had to initialize python" << endl;
    Py_Initialize();
}
PyObject* mod = PyImport_ImportModule("vzPyTest");
if (mod == 0) {
    cout << "didn't load" << endl;
}
PyObject* func1 = PyObject_GetAttrString(mod, "vzPyTest.test1");
PyObject* func2 = PyObject_GetAttrString(mod, "vzPyTest.test2");
plugin.registerUI(func1, func2);

Thanks for the help!!! -- http://mail.python.org/mailman/listinfo/python-list
Registering a python function in C
Could someone post an example on how to register a python function as a callback in a C function? It expects a pointer to PyObject... how do I expose that? Basically, the signature of the function is foo(PyObject* obj), where obj is the callback function... It's not exactly extending or embedding, I've looked at those examples but they don't really show how to do this... Thanks for the help! -- http://mail.python.org/mailman/listinfo/python-list
Bug in execfile?
Hi all, I'm finding the following behavior truly puzzling, but before I post a bug report on the site, I'd rather be corrected if I'm just missing something obvious. Consider the following trivial script:

# Simple script that imports something from the stdlib
from math import sin, pi

wav = lambda k, x: sin(2*pi*k*x)
print wav(1, 0.25)
# EOF

The above runs just fine from a prompt, or even interactively via execfile(). Now, consider calling it by using this instead:

#!/usr/bin/env python
"""Test for a bug (?) in scope handling by the execfile() builtin."""

def runscript(fname):
    """Run a file by calling execfile()."""
    execfile(fname)

# Note: if you activate this section so that execfile() is directly called
# first, then the bug below disappears!!!
if 0:
    print '='*80
    print '# Trying execfile:'
    execfile('execfilebugscript.py')

# The bug: we get an exception from running the little script, where the 'sin'
# name imported from math is not visible to the lambda.
print '-'*80
print '# Trying the runscript wrapper:'
runscript('execfilebugscript.py')
# EOF

If I run the above, calling the first script 'execfilebugscript.py' and the second 'execfilebug.py', I get this:

planck[test]> ./execfilebug.py
# Trying the runscript wrapper:
Traceback (most recent call last):
  File "./execfilebug.py", line 21, in <module>
    runscript('execfilebugscript.py')
  File "./execfilebug.py", line 7, in runscript
    execfile(fname)
  File "execfilebugscript.py", line 6, in <module>
    print wav(1,0.25)
  File "execfilebugscript.py", line 4, in <lambda>
    wav = lambda k,x: sin(2*pi*k*x)
NameError: global name 'sin' is not defined

WTF??? Now even weirder, if the 'if 0' is turned into 'if 1' so that *first* execfile is called at the top-level (not inside a function), then *both* calls work:

planck[test]> ./execfilebug.py
# Trying execfile:
1.0
# Trying the runscript wrapper:
1.0

I'm really, really puzzled by this. From reading the execfile() docs, I had the hunch to change the call to:

execfile(fname, {})

and now the problem disappears, so I can keep on working. But I'm still very bothered by the fact that changing that first call 'if 0' to 'if 1' has any effect on the later call to runscript(). That really doesn't feel right to me... Any wisdom will be much appreciated. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Bug in execfile?
Ian Clark wrote:
> Fernando Perez wrote:
>> Hi all, (snip) I'm really, really puzzled by this. From reading the execfile() docs, I had the hunch to change the call to: execfile(fname, {}) and now the problem disappears, so I can keep on working. But I'm still very bothered by the fact that changing that first call 'if 0' to 'if 1' has any effect on the later call to runscript(). That really doesn't feel right to me... [...]
> By default execfile works on the *current* namespace. So exec'ing a script that modified its global namespace will also modify the global namespace of the calling module (see my first example). If you specify a dictionary then execfile will use that as the global and local (maybe) namespace of the file that it is running (hence the global namespace of the calling module stays intact per the second example). That is why execfile(fname, {}) works for you: it doesn't pollute your current namespace. It uses a different namespace when calling the file than is being used by the calling module.

Right, it makes complete sense that the first execfile() call puts the symbols in the global namespace, and that's why the failure disappears when calling the runscript wrapper. I just didn't think carefully enough, thanks.

> No idea why your function failed though. *shrug*

Yes, that still bothers me. That a simple

def runscript(f):
    execfile(f)

won't work just like execfile would is, at the least, a bit puzzling to me. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
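The same scoping trap survives in Python 3, where execfile() is gone but exec() behaves analogously: run inside a function without an explicit namespace, names defined by the executed code land in the function's locals, where a lambda defined by that code cannot see them; passing a dict gives the code a coherent global namespace of its own. A sketch of both outcomes:

```python
# The body of the original execfilebugscript.py, as a string.
code = (
    "from math import sin, pi\n"
    "wav = lambda k, x: sin(2*pi*k*x)\n"
    "result = wav(1, 0.25)\n"
)

def runscript():
    # Like execfile(fname) inside a function: 'sin' goes into this
    # function's locals, but the lambda's globals are the module's.
    try:
        exec(code)
        return 'ok'
    except NameError:
        return 'NameError'

# Like execfile(fname, {}): the dict serves as the code's globals,
# so the lambda finds 'sin' there.
ns = {}
exec(code, ns)
```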
Re: Plotting Images
Pei-Yu CHAO wrote:
> Hi ALL: I only switched from matlab to python a few months ago. I am having trouble plotting images from a matrix of size 8x1 (unfortunately that is the size of my data). For example:
>
> x = rand(8,1)
> imshow(x)

Read the docstrings, they explain how to use the function:

imshow(x, aspect=300)

gives a reasonable size to look at such a thin matrix, after you widen the window a fair bit. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
[ANN] IPython 0.8.0 is out
Hi all,

The IPython team is happy to release version 0.8.0, with a lot of new enhancements, as well as many bug fixes. We hope you all enjoy it, and please report any problems as usual.

WHAT is IPython?
----------------

1. An interactive shell superior to Python's default. IPython has many features for object introspection, system shell access, and its own special command system for adding functionality when working interactively.

2. An embeddable, ready to use interpreter for your own programs. IPython can be started with a single call from inside another program, providing access to the current namespace.

3. A flexible framework which can be used as the base environment for other systems with Python as the underlying language.

4. A shell for interactive usage of threaded graphical toolkits. IPython has support for interactive, non-blocking control of GTK, Qt and WX applications via special threading flags. The normal Python shell can only do this for Tkinter applications.

Where to get it
---------------

IPython's homepage is at: http://ipython.scipy.org and downloads are at: http://ipython.scipy.org/dist

We've provided:

- Source download (.tar.gz)
- A Python Egg (http://peak.telecommunity.com/DevCenter/PythonEggs).
- A python 2.4 RPM.
- A native win32 installer.

The egg is 'light', as it doesn't include documentation and other ancillary data. If you want a full ipython installation, use the source tarball or your distribution's favorite system.

We note that IPython is now officially part of most major Linux and BSD distributions, so packages for this version should be coming soon, as the respective maintainers have the time to follow their packaging procedures. Many thanks to the distribution packagers for their work, which helps users get IPython more conveniently.

Thanks to all the users who contributed bug reports, ideas and especially patches. The ChangeLog has hopefully detailed acknowledgements, but please let us know if we've accidentally omitted giving you due credit.
Many thanks to Enthought for their continued hosting support for IPython.

Release notes
-------------

As always, the full ChangeLog is at http://ipython.scipy.org/ChangeLog. The highlights of this release follow. Also see the What's New page at http://ipython.scipy.org/moin/WhatsNew

* Support for KeyboardInterrupt (Ctrl-C) when running in multithreaded mode with GUI support. This had been a long-requested feature that we had never quite been able to implement correctly. Many thanks to Tomer Filiba for his ctypes-based trick: http://sebulba.wikispaces.com/recipe+thread2, which we used (any implementation mistakes are our own and not his fault). Users of Python 2.4 should note that they need to install ctypes separately to access this feature; ctypes is part of Python 2.5 already.

* Fully syntax-highlighted tracebacks and debugger listings. IPython used to color tracebacks, but only certain elements; now the source is actually highlighted by the same engine that handles '??' source listings, both in tracebacks and during interactive debugging.

* Improved the ipipe system: http://ipython.scipy.org/moin/UsingIPipe, including a new WX-based graphical browser.

* Much improved unicode support. There may still be bugs remaining, but a number of known-incorrect cases have been fixed.

* Made the execution of 'from pylab import *' when -pylab is given optional. A new flag (which can be set in your ipythonrc file), pylab_import_all, controls this behavior; the default is True for backwards compatibility.

* Extensions for perforce support via a new magic (%p4) and custom command completers.

* Improved support for (X)Emacs under win32.

* Several small fixes and improvements to the interactive demo module.

* Added \N for the actual prompt number, without any coloring, as an escape for customized prompt definitions. This lets users write their own custom prompts with arbitrary coloring schemes.
* Many more bugfixes and small features everywhere (the ChangeLog linked above has the gory details).

API changes:

* genutils.clock() now returns user+system time. The new clocku/clocks functions return respectively user and system time only.

Enjoy, and as usual please report any problems. The IPython team. -- http://mail.python.org/mailman/listinfo/python-list
Re: ipython env
Larry Bates wrote:
> Greg Donald wrote:
>> Anyone know what's up with environment variables when using ipython? [...]
> In CPython you get this with:
>
> import os
> os.environ['EDITOR']

Yup, same in ipython :) Just to clarify, env is just a convenience function in ipython that simply does this:

In [4]: env??
Type:           Magic function
Base Class:     <type 'instancemethod'>
String Form:    <bound method InteractiveShell.magic_env of <IPython.iplib.InteractiveShell object at 0x402046ec>>
Namespace:      IPython internal
File:           /home/fperez/usr/lib/python2.4/site-packages/IPython/Magic.py
Definition:     env(self, parameter_s='')
Source:
def magic_env(self, parameter_s=''):
    """List environment variables."""
    return os.environ.data

Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
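As the reply above notes, os.environ is just a mapping over the process environment, so what %env returns can equally be read and written directly (the EDITOR value below is set to a throwaway value purely for demonstration):

```python
import os

os.environ['EDITOR'] = 'vim'     # hypothetical value, set for the demo
editor = os.environ['EDITOR']    # read it back like any dict entry
has_editor = 'EDITOR' in os.environ
```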
Re: Rational numbers
[EMAIL PROTECTED] wrote:
> On Feb 25, 3:09 pm, Fernando Perez [EMAIL PROTECTED] wrote:
>> Alex, have you had a look at SAGE? http://modular.math.washington.edu/sage/ It uses GMP extensively, so they've had to patch it to work around these issues. You can look at the SAGE release (they package everything as the original tarball + patches) for the GMP-specific stuff you need, though I suspect you'll still want to play with SAGE a little bit :). It's a mighty impressive system.
> Thanks Fernando, I will take a look at that.

Just to save you a bit of time: if you unpack the source tarball, inside the distro you'll find a bunch of .spkg files. Those are really .tar.bz2 files for each included package, containing the original plus any SAGE patches and some install-related scripts. The GMP .spkg has all the OSX patches, plus a few more. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Rational numbers
[EMAIL PROTECTED] wrote:
> gmpy itself is or should be pretty trivial to build on any platform (and I'll always happily accept any fixes that make it better on any specific platform, since it's easy to make them conditional so they'll apply to that platform only), but the underlying GMP is anything but :-(.

Alex, have you had a look at SAGE? http://modular.math.washington.edu/sage/ It uses GMP extensively, so they've had to patch it to work around these issues. You can look at the SAGE release (they package everything as the original tarball + patches) for the GMP-specific stuff you need, though I suspect you'll still want to play with SAGE a little bit :). It's a mighty impressive system. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: pylab, integral of sinc function
Schüle Daniel wrote:
> Hello,
>
> In [19]: def simple_integral(func, a, b, dx=0.001):
>    ....:     return sum(map(lambda x: dx*x, func(arange(a, b, dx))))
>    ....:
>
> In [20]: simple_integral(sin, 0, 2*pi)
> Out[20]: -7.5484213527594133e-08
>
> ok, can be thought of as zero
>
> In [21]: simple_integral(sinc, -1000, 1000)
> Out[21]: 0.99979735786416357
>
> hmm, it should be something around pi; it is way too far from it, even with a=-1, b=1
>
> In [22]: def ppp(x):
>    ....:     return sin(x)/x
>    ....:
>
> In [23]: simple_integral(ppp, -1000, 1000)
> Out[23]: 3.1404662440661117
>
> nice
>
> is my sinc function in pylab broken? is there a better way to do numerical integration in pylab?

Pylab is mostly a plotting library, which happens (for historical reasons I won't go into) to expose a small set of numerical algorithms, most of them actually residing in Numpy. For a more extensive collection of scientific and numerical algorithms, you should look into using SciPy:
S.integrate.Inf S.integrate.composite S.integrate.NumpyTest S.integrate.cumtrapz S.integrate.__all__ S.integrate.dblquad S.integrate.__class__ S.integrate.fixed_quad S.integrate.__delattr__ S.integrate.inf S.integrate.__dict__ S.integrate.newton_cotes S.integrate.__doc__ S.integrate.ode S.integrate.__file__ S.integrate.odeint S.integrate.__getattribute__ S.integrate.odepack S.integrate.__hash__ S.integrate.quad S.integrate.__init__ S.integrate.quad_explain S.integrate.__name__ S.integrate.quadpack S.integrate.__new__ S.integrate.quadrature S.integrate.__path__ S.integrate.romb S.integrate.__reduce__ S.integrate.romberg S.integrate.__reduce_ex__ S.integrate.simps S.integrate.__repr__ S.integrate.test S.integrate.__setattr__ S.integrate.tplquad S.integrate.__str__ S.integrate.trapz S.integrate._odepack S.integrate.vode S.integrate._quadpack These will provide dramatically faster performance, and far better algorithmic control, than simple_integral: In [4]: time simple_integral(lambda x: sinc(x/pi), -100, 100) CPU times: user 7.08 s, sys: 0.42 s, total: 7.50 s Wall time: 7.58 Out[4]: 3.1244509352 In [40]: time S.integrate.quad(lambda x: sinc(x/pi), -100, 100) CPU times: user 0.05 s, sys: 0.00 s, total: 0.05 s Wall time: 0.06 Out[40]: (3.124450933778113, 6.8429604895257158e-10) Note that I used only -100, 100 as the limits so I didn't have to wait forever for simple_integral to finish. As you know, this is a nasty, highly oscillatory integral for which almost any 'black box' method will have problems, but at least scipy is nice enough to let you know: In [41]: S.integrate.quad(lambda x: sinc(x/pi), -1000, 1000) Warning: The maximum number of subdivisions (50) has been achieved. If increasing the limit yields no improvement it is advised to analyze the integrand in order to determine the difficulties. 
If the position of a local difficulty can be determined (singularity, discontinuity) one will probably gain from splitting up the interval and calling the integrator on the subranges. Perhaps a special-purpose integrator should be used. Out[41]: (3.5354545588973298, 1.4922039610659907) In [42]: S.integrate.quad(lambda x:sinc(x/pi), -1000, 1000,limit=1000) Out[42]: (3.1404662439375475, 4.5659823144674379e-08) Cheers, f ps - the 2nd number is the error estimate. -- http://mail.python.org/mailman/listinfo/python-list
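A side note on the original puzzle: pylab/numpy's sinc is the normalized sinc, sin(pi*x)/(pi*x), whose integral over the whole real line is 1 rather than pi, which is why the poster's In [21] result hovered near 1 and why the quad calls above rescale with x/pi. For readers without SciPy at hand, the composite-rule idea behind simple_integral can also be written more carefully with the unnormalized sin(x)/x, in a dependency-free sketch (not code from the thread):

```python
import math

def sinc(x):
    # unnormalized sinc: sin(x)/x, with the removable
    # singularity at x = 0 handled explicitly
    return 1.0 if x == 0.0 else math.sin(x) / x

def trapezoid(f, a, b, n=200000):
    # composite trapezoidal rule on n equal subintervals
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

approx = trapezoid(sinc, -100, 100)
# approx is close to the 3.12445... reported by quad above,
# and approaches pi as the integration limits grow
```

This is of course far slower and less accurate than an adaptive quadrature like quad, but it makes the algorithm explicit.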
Re: Maybe a little bug of ipython 0.7.3 ?
[EMAIL PROTECTED] wrote: I'm new to ipython, and I found it a very cool product. Glad you like it, though in the future I recommend you post on the ipython list. I very rarely scan c.l.py these days, unfortunately. $ ipython Python 2.5 (r25:51908, Sep 19 2006, 09:52:17) [MSC v.1310 32 bit (Intel)] Type copyright, credits or license for more information. IPython 0.7.3 -- An enhanced Interactive Python. snip In [8]: a = range(1000) In [9]: a? Type: list Base Class: type 'list' String Form: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26 ... 0, 981, 982, 983, 984, 985, 986, 987, 988, 989, 990, 991, 992, 993, 994, 995, 996, 997, 998, 999] Namespace: Interactive Length: 1000 Docstring: list() - new list list(sequence) - new list initialized from sequence's items Please note that there is an extra 0 after 26 ..., which doesn't appear in the following cases: The 'foo?' feature just gives you a summary of information about an object, and for very long strings, it truncates them in the center, printing only head...tail where head and tail are each about 100 characters long. What you are seeing is just an accident of where the truncation happens: that '0' is the last digit of 980 :) Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
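The head...tail truncation described here can be sketched in a few lines (a hypothetical helper for illustration, not IPython's actual implementation):

```python
def center_truncate(s, width=200):
    # keep roughly width/2 characters from each end and elide the
    # middle, mimicking the head...tail display described above
    if len(s) <= width:
        return s
    half = width // 2
    return s[:half] + "..." + s[-half:]

long_repr = str(list(range(1000)))
short = center_truncate(long_repr)
# the cut lands wherever the length dictates -- possibly mid-number,
# which is exactly the stray '0' the poster noticed
```

Any such scheme cuts blindly at a character boundary, so a trailing digit from a split number is expected behavior, not a bug.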
Re: Exploiting Dual Core's with Py_NewInterpreter's separated GIL ?
sturlamolden wrote: Following up on my previous post, there is a simple Python MPI wrapper that can be used to exploit multiple processors for scientific computing. It only works for Numeric, but an adaptation to NumPy should be easy (there is only one small C file in the source): http://datamining.anu.edu.au/~ole/pypar/ If you are using Windows, you can get a free implementation of MPI here: http://www-unix.mcs.anl.gov/mpi/mpich1/mpich-nt/ It's also worth noting http://mpi4py.scipy.org The project just moved over to scipy hosting a few days ago, and the boxes haven't been unpacked yet. But it is very actively developed, has full numpy and MPI-2 support, and is looking very nice. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: switching to numpy and failing, a user story
[EMAIL PROTECTED] wrote: After using Numeric for almost ten years, I decided to attempt to switch a large codebase (python and C++) to using numpy. Here are some comments about how that went. - The code to automatically switch python stuff over just kind of works. But it was a 90% solution; I could do the rest by hand. Of course, the problem is that then the code is still using the old Numeric API, so it's not a long-term solution. Unfortunately, to switch to the numpy API one needs documentation, which is a problem; see below. [ RNG issues, already addressed ] - My extension modules just won't build because the new numpy stuff lives in a different location from where Numeric used to live. I probably could fix this, but without the documentation I can't figure out how to do that. I'd also need to figure out how to port my code to use the new numpy API instead of the compatibility layer, but I can't do that without docs either. I have to call bull on this. I happen to have the for-pay book, but I recently converted a large codebase from Numeric to numpy, and I actually never had to open the book. Travis has made sure that the compatibility-related information is easy to find: http://www.scipy.org/Converting_from_Numeric has most of what you need, and this page: http://www.tramy.us/guidetoscipy.html contains a link to the first two chapters FOR FREE. A lot of what's needed for a port is covered in there. In addition, the C API is available here (as well as being obviously in the source, which you have access to): http://www.scipy.org/NumPyCapi And the doc page has many more useful links: http://www.scipy.org/Documentation So yes, you have to buy the book. Travis has sunk over a year of his time into an absolutely herculean effort that provides working scientists with tools that are better than anything money can buy (yes, I have access to both Matlab and IDL, and you can't pay me enough to use them instead of Python). 
And he has the gall to ask for some money for a 300-page book? How dare he, when one can walk into any Barnes and Noble and just walk out of the store with a cart full of books for free! It's funny how I don't see anyone complaining about any of the Python books sold here (or at any other publishing house): http://www.oreilly.com/pub/topic/python I recently was very happy to pay for the Twisted book, since I need Twisted for a project and a well-organized book is a good complement to the auto-generated documentation. And finally, if you had porting problems, none were reported on any of the numpy/scipy mailing lists (else you used a different name or email, since I can't find traces of queries from you in my gmail archive, where I keep everything posted on all the lists): http://www.scipy.org/Mailing_Lists Lots of people have been porting their codes recently, and inevitably some have run into difficulties. EVERY single time when they actually say something on the list, Travis (and others as well) is very fast with specific help on exactly how to solve the problems, or with bug fixes when the problem happens to be a numpy bug discovered by the user. But don't take my word for it: http://sourceforge.net/mailarchive/forum.php?thread_id=30703688&forum_id=4890 (and Francesc has found some really nasty things, given how pytables pushes numpy far beyond where Numeric ever went, so this is not a light compliment). Look, I'm sure you had issues with your code, we all have. But I want to make sure that others don't take from your message the wrong impression regarding numpy, its future, its quality as a scientific computing platform, or Travis (I'd build the man a statue if I could :). The environment which is developing around Python for scientific computing is nothing short of remarkable. If you find issues in the process, ask for help on the numpy/scipy lists and I'm sure you will receive some. But please refrain from spreading FUD. 
Regards, f -- http://mail.python.org/mailman/listinfo/python-list
Re: Interact with system command prompt
billie wrote: Uhm... It seems that IPython got some problems: http://ipython.scipy.org/doc/manual/node12.html In details: Note that this does not make IPython a full-fledged system shell. In particular, it has no job control, so if you type Ctrl-Z (under Unix), you'll suspend pysh itself, not the process you just started. What the shell profile allows you to do is to use the convenient and powerful syntax of Python to do quick scripting at the command line. Note that these are simply limitations of what has been done out of developer interest, not fundamental ones of design. I'm not in the business of writing system shells, so neither I nor anyone else ever implemented full-fledged job control. If you do it and send it in, we'll be happy to include it. And I suspect IPython does a LOT of stuff already that you'll spend a fair amount of time rewriting from scratch, so perhaps helping to fill in the missing pieces might be a good investment of your time. Cheers, f (ipython lead developer) -- http://mail.python.org/mailman/listinfo/python-list
Re: Scientific computing and data visualization.
Matteo wrote: One hurdle to overcome is transferring array data from Numeric/Numpy into VTK. I have a sort of ad-hoc method to do that (mainly for volume data). If anyone knows of any elegant solution, or a module to ease the pain, I'd like to hear about it. https://svn.enthought.com/enthought/wiki/TVTK Much, much, MUCH nicer interface to VTK than the plain bindings that come by default. And built from the ground up to seamlessly couple numpy with VTK. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
Re: unit testing failure makes no sense
[EMAIL PROTECTED] wrote: I have some unit testing code in one of my modules that appears to run without an error, but the unit test fails anyhow. Have a look at the output below -- the TestResult seems to have no errors and no failures, yet I get a system exit. unittest.main() ALWAYS raises SystemExit: /Library/Frameworks/Python.framework/Versions/2.4/lib/python2.4/unittest.py in runTests(self=unittest.TestProgram object at 0x10ee670) 795 self.testRunner = TextTestRunner(verbosity=self.verbosity) 796 result = self.testRunner.run(self.test) --> 797 sys.exit(not result.wasSuccessful()) global sys.exit = built-in function exit result.wasSuccessful = bound method _TextTestResult.wasSuccessful of unittest._TextTestResult run=3 errors=0 failures=0 798 799 main = TestProgram You normally don't see it because the python interpreter doesn't flag SystemExit at the very end (I'm not sure of the details there). For this very reason, ipython's run magic has a -e flag, which makes it ignore SystemExit exceptions, and lets you run your unit tests from within ipython in peace. Next time, ask this one on the ipython list: it was a pure accident that I caught it, I have unfortunately no time left for c.l.py these days. Regards, f -- http://mail.python.org/mailman/listinfo/python-list
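For completeness, later Python versions (2.7+, well after this thread) grew a direct escape hatch: unittest.main(exit=False) skips the sys.exit() call entirely. A small sketch (the test case itself is a made-up example):

```python
import unittest

class TestDemo(unittest.TestCase):
    def test_sum(self):
        self.assertEqual(sum(range(10)), 45)

# unittest.main() normally ends with sys.exit(not result.wasSuccessful()),
# which is the SystemExit seen in the traceback above.  Since Python 2.7,
# exit=False suppresses it; on 2.4-era Pythons you would instead wrap the
# call in try/except SystemExit (or use ipython's run -e).
program = unittest.main(module=__name__, argv=["tests"], exit=False)
print(program.result.wasSuccessful())  # True
```

The returned TestProgram object carries the result, so the surrounding session can inspect it and keep running.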
Re: Matplotlib eps export
Martin Manns wrote: Hi, When I use matplotlib for a scatter plot with both dots and connecting lines, the exported eps file is huge, if the distances between many points are small. I think of this as a bug, since no preview tiff is included in the generated eps and a variety of text processing applications (including OpenOffice) crash when I try to import the eps. Ghostscript takes forever, too. Is there anything that I can do in order to export reasonable eps files? I don't know the answer to your question, but I know the probability of getting an answer will be a LOT higher if you post on the matplotlib list. c.l.py is only casually monitored by the mpl devs. Cheers, f -- http://mail.python.org/mailman/listinfo/python-list
IPython 0.7.2 is out.
Hi all, The IPython team is happy to release version 0.7.2, with a lot of new enhancements, as well as many bug fixes. We hope you all enjoy it, and please report any problems as usual. WHAT is IPython? 1. An interactive shell superior to Python's default. IPython has many features for object introspection, system shell access, and its own special command system for adding functionality when working interactively. 2. An embeddable, ready to use interpreter for your own programs. IPython can be started with a single call from inside another program, providing access to the current namespace. 3. A flexible framework which can be used as the base environment for other systems with Python as the underlying language. 4. A shell for interactive usage of threaded graphical toolkits. IPython has support for interactive, non-blocking control of GTK, Qt and WX applications via special threading flags. The normal Python shell can only do this for Tkinter applications. Where to get it --- IPython's homepage is at: http://ipython.scipy.org and downloads are at: http://ipython.scipy.org/dist We've provided: - Source download (.tar.gz) - An RPM (for Python 2.4, built under Ubuntu Dapper 6.06). - A Python Egg (http://peak.telecommunity.com/DevCenter/PythonEggs). - A native win32 installer. The egg is 'light', as it doesn't include documentation and other ancillary data. If you want a full ipython installation, use the source tarball or your distribution's favorite system. We note that IPython is now officially part of most major Linux and BSD distributions, so packages for this version should be coming soon, as the respective maintainers have the time to follow their packaging procedures. Many thanks to Jack Moffit, Norbert Tretkowski, Andrea Riciputi, Dryice Liu and Will Maier for the packaging work, which helps users get IPython more conveniently. Many thanks to Enthought for their continued hosting support for IPython. 
Release notes - As always, the full ChangeLog is at http://ipython.scipy.org/ChangeLog. The highlights of this release follow. Also see the What's New page at http://projects.scipy.org/ipython/ipython/wiki/WhatsNew for more details on some of these features. * Walter Doerwald's ipipe module, which provides a handy way to browse and manipulate tabular data, e.g. groups of files or environment variables (this is currently mostly a *nix feature, due to its need for ncurses). Walter is now a member of the IPython team. * The IPython project is the new home for the UNC readline extension, which allows win32 users to access readline facilities (tab completion, colored prompts, and more). UNC readline has been renamed PyReadline, and has a number of important new features, especially for users of non-US keyboards. See this page for more details: http://projects.scipy.org/ipython/ipython/wiki/PyReadline/Intro * A new extension and configuration API. * Hardened persistence. Persistence of data now uses pickleshare, a shelve-like module that allows concurrent access to the central ipython database by multiple ipython instances. * Simpler output capture: files=!ls will now capture the 'ls' call into the 'files' variable. * New magic functions: %timeit, %upgrade, %quickref, %cpaste, %clip, %clear. Also, a 'raw' mode has been added to %edit, %macro, %history. * Batch files. If the file ends with .ipy, you can launch it by ipython myfile.ipy. It will be executed as if it had been typed interactively (it can contain magics, aliases, etc.) * New pexpect-based 'irunner' module, to run scripts and produce all the prompts as if they had been typed one by one. This lets you reproduce a complete interactive session from a file, which can be very useful when producing documentation, for example. The module provides default runners for ipython, plain python and SAGE (http://sage.scipy.org). 
Users can subclass the base runner to produce new ones for any interactive system whose prompts are predictable (such as gnuplot, a system shell, etc.). * New option to log 'raw' input into IPython's logs. The logs will then be valid .ipy batch scripts just as you typed them, instead of containing the converted python source. * Fixes and improvements to (X)Emacs support. PDB auto-tracking is back (it had broken in 0.7.1), and auto-indent now works inside emacs ipython buffers. You will need to update your copy of ipython.el, which you can get from the doc/ directory. A copy is here, for convenience: http://ipython.scipy.org/dist/ipython.el * The ipapi system offers a new to_user_ns() method in the IPython object, to inject variables from a running script directly into the user's namespace. This lets you have internal variables from a script visible interactively for further manipulation after %running it. * Thanks to Will Maier, IPython is now
Is the standard output thread-safe?
Hi, Is the standard output thread-safe? Can I use print from several threads without having to use a mutex? Thanks -- http://mail.python.org/mailman/listinfo/python-list
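This question went unanswered in this archive. The usual answer (a hedged sketch, not from the thread): in CPython, output built from several separate writes can interleave across threads, so the safe pattern is to guard the shared output channel with a lock:

```python
import threading

# A plain demonstration of the locking pattern.  Here a list stands in
# for sys.stdout so the result is easy to check; in real code the body
# of say() would be a print statement.
write_lock = threading.Lock()
output = []

def say(msg):
    with write_lock:           # one thread emits at a time
        output.append(msg)

threads = [threading.Thread(target=say, args=("line %d" % i,))
           for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(output))  # 8
```

With the lock, each message arrives whole; without it, multi-part writes from different threads may be interleaved.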
Blocking a thread for x seconds
Hi, I have a class that descends from threading.Thread. One method should block the thread during x seconds and then call another method. How can I do this? -- http://mail.python.org/mailman/listinfo/python-list
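A direct answer (a sketch, not from the thread): calling time.sleep(x) inside run() blocks only that thread, or threading.Timer can delay the follow-up call by x seconds:

```python
import threading

fired = threading.Event()

def callback():
    # the method to invoke after the delay
    fired.set()

# threading.Timer waits the given number of seconds in its own
# thread, then calls the function -- no busy-waiting, no mutex needed
t = threading.Timer(0.1, callback)
t.start()

fired.wait(timeout=5)        # block the caller until the callback runs
print(fired.is_set())  # True
```

Inside a threading.Thread subclass, the equivalent is simply time.sleep(x) followed by self.other_method() in run(); Timer is handy when the delayed call should not block the current thread.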