Re: pyyaml 6

2022-11-02 Thread Gordon Ball

On 19/10/2022 22:30, Gordon Ball wrote:

On 09/10/2022 21:39, Gordon Ball wrote:

On 07/10/2022 01:13, Timo Röhling wrote:

Hi Gordon,

* Gordon Ball  [2022-10-07 00:10]:

* Upload to unstable and see what breaks?
* Request an archive rebuild with this version and see what breaks?
* File bugs against all likely affected packages with a fixed date for
an upload?
* Wait until after the freeze?

Considering that PyYAML has been issuing a YAMLLoadWarning since
version 5.1 (i.e. September 2019), I would expect that many (most?)
reverse dependencies have been fixed, and anything that still breaks
is probably in dire need of maintenance anyway.


Yes, that's a good point.

I've done some more investigation with reverse-depends and searched for
likely bad patterns with codesearch.d.n, and the set of packages which
A) appear to contain a plain `yaml.load(f)` call or equivalent, where
`yaml` is pyyaml, and B) actually depend or build-depend on
`python3-yaml` is smaller than I feared. This is what I came up with:

```
ansible # confirm, in ansible_collections/community/general/plugins
buildbot # confirm, in porttostable
ceph # confirm, in civetweb/third_party/duktape, src/test/crimson
docker-compose # confirm, in tests
dose3 # confirm, in scripts
elasticsearch-curator # confirm, in test/unit
etm # confirm, in etmTk/data
ganeti # confirm, in qa, test/py
gnocchi # confirm, in gnocchi/gendoc
gr-dab # confirm, in python/app
jeepyb # confirm, in cmd/notify_impact
knitpy # confirm, in knitpy
labgrid # confirm, in remote/coordinator
lecm # confirm, in lecm/configuration
lirc # confirm, in lirc/database
lqa # confirm, in lqa_api/job
open-adventure # confirm, in tests
owslib # confirm, in ogcapi
policyd-rate-limit # confirm, in config, tests
python-aptly # confirm, in publisher
python-canmatrix # confirm, in canmatrix/formats/yaml
python-multipart # confirm, in multipart/tests
python-pybedtools # confirm, in test, contrib
python-seqcluster # confirm, in install
python-tempestconf # confirm, in config_tempest/profile
qcat # confirm, in adapters
qcengine # confirm, in config
refstack-client # confirm, in refstack_client
relatorio # confirm, in templates/chart
ripe-atlas-tools # confirm, in tools/settings
ros-genpy # confirm, in test
ros-rosinstall # confirm, in test, src/rosinstall
ros-rosinstall-generator # confirm, in test, src/rosinstall_generator
spades # confirm, in assembler/src/test/teamcity, assembler/src/projects/mts, assembler/src/spades_pipeline
xrstools # confirm, in WIZARD
zlmdb # confirm, in _database
```

I don't know whether all these code paths are ever actually reached, or
whether the tests are ever run, so the packages won't necessarily break
(or the breakage may be very minor, such as a single community plugin in
ansible). I can't guarantee there are no omissions from the list, though.

There is a significantly longer list of packages which appear to contain
a use of yaml.load _somewhere_, but it is usually in some
maintainer/release script, or an optional path somewhere, and the
package itself doesn't [build-]depend on python3-yaml.



I've filed bugs for most of the packages listed above (some were
dropped on review because they weren't in main, or because the affected
file appeared to be unused at build time and not installed). I've
already had a couple of acknowledgements/fixes.


A number of the affected packages look very sparsely used and/or
maintained, and I doubt they will be fixed. However, of the packages
above, only two have autopkgtests which fail on the experimental
migration, so I wouldn't be surprised if some are already broken by
other past changes.


I propose to wait ~2 weeks until the beginning of November and upload 
pyyaml 6 to unstable then.




pyyaml 6 has now landed in unstable. About half the bugs I filed have 
been resolved, and there is only one package (ganeti) with autopkgtests 
blocking migration.




pyyaml 6

2022-10-06 Thread Gordon Ball
pyyaml (aka python3-yaml) is an rdepend for >300 packages. We currently
have 5.4.1, but version 6 was released late last year, which does quite
a lot of cleanup (eg, dropping python 2 support) and disables unsafe
loading (arbitrary python code execution) unless explicitly opted into.

Unfortunately this is a breaking change for any code which uses the
simple `yaml.load(fileobj)` idiom - it now needs to be
`yaml.safe_load(fileobj)` or `yaml.load(fileobj, Loader=yaml.SafeLoader)`.
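
For illustration, a minimal before/after sketch of the change (the file name
here is just an example):

```
import yaml

with open("example.yml") as fileobj:
    # Old idiom: emits a YAMLLoadWarning on pyyaml 5.1+ and fails on 6.0,
    # where the Loader argument is required.
    # data = yaml.load(fileobj)

    # Safe replacements, equivalent for plain data files:
    data = yaml.safe_load(fileobj)
    # or: data = yaml.load(fileobj, Loader=yaml.SafeLoader)

print(data)
```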

I uploaded 6.0 to experimental late last year, but time constraints
meant it never got pushed forward. I've recently refreshed the version
in experimental (experimental doesn't get binnmus so it was still only
built for 3.9), and there are relatively few autopkgtest regressions:
https://qa.debian.org/excuses.php?experimental=1&package=pyyaml

However, using codesearch suggests there are quite a few places which
are likely to break:
http://codesearch.debian.net/search?q=yaml%5B.%5Dload%5B%28%5D%5B%5E%2C%5D%2B%5B%29%5D+filetype%3Apython=0=1

Some of these are false positives (invocations of ruamel.yaml, the
string appearing in documentation), and the regex will also miss some
cases - but that still looks like it leaves a significant number of
packages potentially broken (and presumably lacking autopkgtest
coverage).
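
A rough local equivalent of that search, for scanning a single unpacked
source tree (a sketch only - it shares the same false positives and misses):

```
import re
import sys
from pathlib import Path

# Same idea as the codesearch query above: a yaml.load(...) call whose
# argument list contains no comma, i.e. no explicit Loader=... argument.
PATTERN = re.compile(r"yaml\.load\([^,]+\)")

root = Path(sys.argv[1]) if len(sys.argv) > 1 else Path(".")
for path in root.rglob("*.py"):
    for lineno, line in enumerate(path.read_text(errors="replace").splitlines(), 1):
        if PATTERN.search(line):
            print(f"{path}:{lineno}: {line.strip()}")
```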


So, I'd like to seek some input on how to move forward.

* Upload to unstable and see what breaks?
* Request an archive rebuild with this version and see what breaks?
* File bugs against all likely affected packages with a fixed date for
an upload?
* Wait until after the freeze?

The only bug requesting it actually be upgraded is
https://bugs.debian.org/1008262 (for openstack). I don't know if that
has proved a hard blocker - I _think_ anything designed to work with 6.x
should also work with 5.4.


Gordon



Bug#1003716: ITP: python-pure-eval -- Safely evaluate Python AST nodes

2022-01-14 Thread Gordon Ball
Package: wnpp
Severity: wishlist
Owner: Gordon Ball 
X-Debbugs-Cc: debian-python@lists.debian.org, gor...@chronitis.net

* Package name: python-pure-eval
  Version : 0.2.1
  Upstream Author : Alex Hall
* URL : https://github.com/alexmojaki/pure_eval
* License : MIT
  Programming Lang: Python
  Description : Safely evaluate Python AST nodes

This is a Python package that lets you safely evaluate certain AST nodes
without triggering arbitrary code that may have unwanted side effects.

This is a dependency of python-stack-data, itself a dependency of
IPython 8.0.

This will be maintained in the debian-python team.



Bug#1003645: ITP: python-stack-data -- More useful tracebacks for python

2022-01-13 Thread Gordon Ball
Package: wnpp
Severity: wishlist
Owner: Gordon Ball 
X-Debbugs-Cc: gor...@chronitis.net, debian-python@lists.debian.org

* Package name: python-stack-data
  Version : 0.1.3
  Upstream Author : Alex Hall
* URL : https://github.com/alexmojaki/stack_data
* License : MIT
  Programming Lang: Python
  Description : More useful tracebacks for python

This library extracts more detail from stack frames and tracebacks,
showing richer debugging information, particularly for interactive use.

This is a new dependency for IPython 8.0

It will be maintained in the debian-python team.



Jupyter team?

2021-05-18 Thread Gordon Ball
On Mon, May 17, 2021 at 06:20:19PM +0200, Roland Mas wrote:
> Hi everyone,
> 
> I've been contracted by Synchrotron Soleil to work on the packaging of
> Jupyterhub and its dependencies. This turns out to be about 20 Python packages,
> most of which should probably go under the Debian Python Team umbrella
> (although some may go into Debian Science). So I hereby request to be added
> to the python-team group on salsa. My salsa login is "lolando", and I have
> read and accept the 
> https://salsa.debian.org/python-team/tools/python-modules/blob/master/policy.rst
> policy.

Hello

Good to have more people working on jupyter and related tools. Can you
say what the extent of your goal here is? Does it just relate to the
jupyterhub server, spawners, proxy, etc or does your target also include
some work on the jupyter interfaces/core side?

I wonder if it is time to have a distinct jupyter packaging team given
the (perhaps concerningly?) growing size of this software stack [1].

Although I don't know how much time I'll have to work on this in the
near future (new job, etc), my broad goals for this next cycle would be:

 * See if we can get jupyterlab packaged. This has an unpleasant list of
   javascript dependencies, but now that there is a v3 release I hope it
   might stabilise a bit.
 * Try and get ipywidgets into better shape - it's currently quite out
   of date, requiring patches in consuming libraries, and blocked on
   javascript problems (many of which are probably in common with
   jupyterlab)
 * Go through the list of jupyter kernels for other languages and see
   if some of these have matured and stabilised and might be good
   candidates where the given ecosystem is already well supported in
   debian - iruby, ijavascript, gophernotes (go) look like conceivable
   candidates.


Gordon


[1]: A quick survey of jupyter in debian:

Core components:

 * jupyter-core
 * jupyter-client
 * jupyter-console
 * jupyter-notebook
 * jupyter-packaging
 * jupyter-server
 * jupyterlab-pygments
 * jupyterlab-server
 * nbclient
 * nbconvert
 * nbformat
 * qtconsole

Kernels and related:

 * ipykernel
 * ipyparallel
 * ipywidgets
 * metakernel
 * nbsphinx
 * octave-kernel
 * r-cran-irdisplay
 * r-cran-irkernel
 * xeus
 * xeus-python

There are probably python or other language libraries which are
exclusively jupyter dependencies also.



python-team/packages repo move request

2020-10-26 Thread Gordon Ball
Hello

Julien sent a request to move/archive the redundant ipython repo about
two weeks ago, but it looks like it was missed by everyone with admin
rights.

Could someone with powers:

1. Archive python-team/packages/ipython
2. Rename python-team/packages/ipython4 -> python-team/packages/ipython

The split naming is left over from an old transition years ago and it
would be good to just have the repo "ipython" instead of "ipython4",
since we're now up to version 7 and the naming no longer makes any
sense.

More general question - is there a better way of requesting this sort of
thing other than ML/IRC? Maybe we could create a pseudopackage in the
BTS, or a dummy salsa repo where we can open issues?

Thanks

Gordon



Re: [Python-modules-team] Processing of paramiko_2.7.1-1_source.changes

2020-05-12 Thread Gordon Ball
On Mon, May 11, 2020 at 10:37:47PM -0400, Sandro Tosi wrote:
> > Any trick to avoid those errors in general?
> 

You can also do

$ gbp push

which will automatically push the main (master or debian/master),
upstream and pristine-tar branches. By default it does not push
UNRELEASED changes (ie, the main branch only gets pushed if there is a
release tag), but it still helps to keep the upstream and pristine-tar
branches up to date.

There are also, equivalently,

$ gbp pull
$ gbp clone

to ensure all of these branches are updated from the remote without the
need to switch to them in turn and push/pull them.

Gordon

> when i push i always do
> 
> $ git push --all ; git push --tags
> 
> i also have these in ~/.gitconfig
> 
> [push]
>default = current
>followTags = true
> 
> which should make the `git push --tags` unnecessary after `git push
> --all` (as tags will "follow" the diffs you're pushing) but it's stuck
> in my bash history so there it is
> 
> > Anyways, it should be there now!
> 
> indeed it's there -- thanks!
> 
> -- 
> Sandro "morph" Tosi
> My website: http://sandrotosi.me/
> Me at Debian: http://wiki.debian.org/SandroTosi
> Twitter: https://twitter.com/sandrotosi
> 



Re: py2removal: proposal to increase apps popcon limit

2020-03-31 Thread Gordon Ball
On Tue, Mar 31, 2020 at 10:04:34AM -0400, Scott Kitterman wrote:
> On Tuesday, March 31, 2020 9:54:41 AM EDT Sandro Tosi wrote:
> > >> Maybe you can paste the list of 20 affected apps, though?
> > > 
> > > I'm more interested in the 18 that are above 1000.  Could you please just
> > > list them all 38, sorted by popocon?
> > here's the list:
> > 
> > cvs2svn - popcon = 303,
> > chirp - popcon = 350,
> > gnat-gps - popcon = 354,
> > cfv - popcon = 387,
> > fretsonfire-game - popcon = 394,
> > neopi - popcon = 398,
> > kcachegrind-converters - popcon = 414,
> > virt-goodies - popcon = 430,
> > freeorion - popcon = 433,
> > syncthing-gtk - popcon = 435,
> > postnews - popcon = 557,
> > pssh - popcon = 562,
> > displaycal - popcon = 562,
> > archivemail - popcon = 674,
> > syslog-summary - popcon = 692,
> > xboxdrv - popcon = 696,
> > trash-cli - popcon = 753,
> > fsl-5.0-core - popcon = 822,
> > denyhosts - popcon = 852,
> > geda-utils - popcon = 938,
> > mirage - popcon = 1006,
> > gnumeric-plugins-extra - popcon = 1110,
> > getmail - popcon = 1191,
> > mailman - popcon = 1344,
> > smart-notifier - popcon = 1362,
> > xchat - popcon = 1606,
> > mysql-workbench - popcon = 1688,
> > calligra-libs - popcon = 2296,
> > mysql-utilities - popcon = 2448,
> > lokalize - popcon = 2598,
> > libboost-mpi-python1.67.0 - popcon = 2691,
> > bleachbit - popcon = 2773,
> > qbittorrent - popcon = 3402,
> > libapache2-mod-python - popcon = 3778,
> > nagios-plugins-contrib - popcon = 4059,
> > playonlinux - popcon = 4781,
> > sugar-browse-activity - popcon = 6421,
> > ipython - popcon = 8296,
> 
> Thanks.  I believe ipython will be gone shortly.

The RM bug was filed this morning. #955404
> 
> I do not see any problems with bumping to 1,000 now.
> 
> For the ones > 1,000, I think we should do some assessment of what needs to 
> happen on a case by case basis.  Some of those, I suspect, only use python2 
> for very limited purposes that could either be dropped or updated to python3.
> 
> Scott K
> 
> 
> 



Re: Bug#952952: RM: src:ipykernel-py2 -- RoM for Python 2 removal

2020-03-02 Thread Gordon Ball
On Mon, Mar 02, 2020 at 09:14:52AM -0500, Sandro Tosi wrote:
> > The src:ipykernel-py2 package provides python-ipykernel, which is to be
> > removed for the Python 2 removal transition.
> 
> Cc-ing Gordon explicitly: the last time we spoke, Gordon wanted to
> keep the python2 stack of ipython/ipykernel/jupyter for people to still
> have a py2 infrastructure to run their notebooks. Has this changed? Are
> we ready to remove them all?

Hi, and thanks for pinging me on this.

I had originally wanted to hang onto ipykernel-py2 on the basis that it
didn't add much to the dependency tree that wasn't in any case blocked by
ipython [0], and that it is _probably_ useful for a certain amount of
legacy analysis code.

However, keeping python2 support around when it's blocking the
wider py2removal transition is not really a hill I want to die on. Given
that this bug has been filed, and that I'm no longer actually using the
python2 version myself, I'm inclined to just let the RM happen.

Gordon

[0]: ipykernel -> jupyter-client -> jupyter-core -> pyzmq can be dropped
together; I think all other dependencies are blocked on at least
something else.


> 
> Regards,
> -- 
> Sandro "morph" Tosi
> My website: http://sandrotosi.me/
> Me at Debian: http://wiki.debian.org/SandroTosi
> Twitter: https://twitter.com/sandrotosi



Re: Separate lib and executable packages?

2020-02-11 Thread Gordon Ball
On Sat, Feb 08, 2020 at 08:54:27PM +0100, Ondrej Novy wrote:
> Hi,
> 
> so 8. 2. 2020 v 20:51 odesílatel Gordon Ball  napsal:
> 
> > Perhaps this is worth making an explicit recommendation for new packages
> > of this type, given that anything new _should_ be python3-only and we
> >
> 
> and what about pypy- prefix?

That's a good point.

Not many packages are likely to provide pypy{,3}-invoking binaries
(though ipython is actually a good candidate here), so it probably
counts as an exceptional case which can reasonably bypass a recommendation.

I suppose you could:

 * Not ship any executables which invoke pypy. For pypy3, that appears
   to be the case today (nothing in /usr/bin using pypy3 except the
   interpreter itself).
 * Ship the library and python3 executable together (possibly with a
   Provides: for the executable), and the pypy3 executable in a
   separate package (since the library itself presumably
   doesn't want to depend on pypy3, and might need to depend on
   pypy-specific library packages). This saves 1 binary package but
   seems inconsistent. The actual implementation could be picked with
   update-alternatives.
 * Ship library, python3 executable and pypy3 executable separately.
   This seems more consistent but generates an extra binary package.

This is a good case to consider in this thread because ipython and
ipykernel are reasonable candidates for a pypy-specialised executable,
and ipython at least depends only on pure-python modules which work
out of the box with pypy.

> 
> -- 
> Best regards
>  Ondřej Nový



Re: Separate lib and executable packages?

2020-02-08 Thread Gordon Ball
On Sat, Feb 08, 2020 at 02:42:11AM +, Scott Kitterman wrote:
> 
> 
> On February 7, 2020 3:59:46 PM UTC, Mattia Rizzolo  wrote:
> >On Fri, Feb 07, 2020 at 11:36:00AM +0000, Gordon Ball wrote:
> >> I wonder if this split really makes sense; it feels like adding the
> >> overhead of an extra binary package to avoid not having a very small
> >> file in /usr/bin if you're only planning to use the library.
> >> 
> >> Does it seem reasonable to drop the executable package and just make
> >it
> >> a Provides: of the library package? (and is there any potential
> >breakage
> >> here that I'm overlooking)
> >
> >
> >Maybe not for ipython3, since that's very much tied to
> >python3-ipython3.
> >
> >BUT, as a user (even forgetting I'm also a DD) I was hurt many times by
> >executables in python-foo but wanting to use python3-foo instead, or by
> >executables that moved from python-foo to python3-foo and I had to fix
> >my own deployments, and whatnot.
> >
> >We are not going to have a python4 anytime soon (hopefully never), but
> >I
> >think that keeping a separate binary package makes sense for almost all
> >cases I can think of packages shipping executables under /usr/bin/.
> 
> Except: Every time we add a binary to Debian it impacts everyone.  The 
> packages file gets bigger and so updates get slower.  Although there's no 
> firm rule, the FTP Masters discourage addition of trivially small packages 
> and so your package might be rejected.
> 

Perhaps it is worth making an explicit recommendation for new packages
of this type, given that anything new _should_ be python3-only and we
can hope for at least some time before another python major version
becomes an issue. Something like "Packages which are primarily libraries
but include an executable should avoid multiple binary packages without
good reason"? - at least in the above-linked style guide, since it's not
normative policy in any case.

For packages already split (eg, ipython3, python3-ipython), having
thought a bit more I suspect it's probably more trouble than it is worth
to drop a handful of binary packages. I'm wondering, for example, what
happens if the executable package is manually installed and the library
hence flagged as automatically installed; if the former then becomes a
Provides: of the latter, is it immediately a candidate for autoremoval?

> As an example, the dkimpy module that I maintain provides some scripts.  
> Originally they were provided in python-dkim.  Now they are provided in 
> python3-dkim (and since python-dkim is gone in unstable/testing, there's no 
> more chance of users being confused about which one to install to get the 
> scripts).
> 
> I recall a few questions by people that didn't know where to find the 
> scripts.  I could have had both packages provide it using 
> update-alternatives, but the problem never seemed severe enough to be worth 
> the work.
> 
> While splitting things out as suggested makes sense from the perspective of a 
> single package, it has negative implications for the distro as a whole.  
> Maintainers: please be mindful of the cumulative impact of the addition many 
> small packages that aren't strictly needed.
> 
> Scott K
> 



Separate lib and executable packages?

2020-02-07 Thread Gordon Ball
The library style guide [1] says under _Executables and library
packages_:

> Here are some recommendations. We do not have a standard (though maybe
> we should)

regarding whether a library with an associated executable should be
split across `foo` and `python3-foo` or just bundled into `python3-foo`.
Does anyone consider that there is now a standard either way?

In ipython, for example, this means shipping `ipython3` (containing a
~300 byte entry-point script and a similarly sized man page) as a
separate package from the implementation in `python3-ipython`.
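
For context, the shipped script is essentially the standard console_scripts
stub; an illustrative sketch (the file actually installed may differ slightly):

```
#!/usr/bin/python3
# Sketch of a /usr/bin/ipython3 entry-point stub: it only hands control
# to the entry point exported by the library package.
import sys

from IPython import start_ipython

if __name__ == "__main__":
    sys.exit(start_ipython())
```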

I wonder if this split really makes sense; it feels like adding the
overhead of an extra binary package just to spare people who only want
the library a very small file in /usr/bin.

Does it seem reasonable to drop the executable package and just make it
a Provides: of the library package? (and is there any potential breakage
here that I'm overlooking)

Gordon

[1]: https://wiki.debian.org/Python/LibraryStyleGuide



RFS: jupyter-core, jupyter-client

2017-12-23 Thread Gordon Ball
Almost-christmas RFS:

jupyter-core (4.3.0-1 -> 4.4.0-1)

New upstream; adds a new binary package ("jupyter", metapackage
depending on a sensible jupyter distribution).

jupyter-client (5.1.0-1 -> 5.2.0-1)

New upstream.



Packages can be found in mentors:

https://mentors.debian.net/package/jupyter-core
https://mentors.debian.net/package/jupyter-client

Or git:

https://anonscm.debian.org/git/python-modules/packages/jupyter-client.git
https://anonscm.debian.org/git/python-modules/packages/jupyter-core.git



Changelogs should be finalised for upload, but tags haven't been added
in git.

Thanks

Gordon



Re: Bug#878498: snakemake FTBFS with Python 3.6 as default

2017-12-07 Thread Gordon Ball
On 06/12/17 22:14, Andreas Tille wrote:
> control: tags -1 help
> 
> Hi,
> 
> I've upgraded snakemake in Git to latest upstream but it's featuring the
> same issue.  I'm pretty sure you Python people know a simple answer for
> this issue.
To first order, the issue with the `export PATH` line can be fixed by
changing it to

`export PATH:=$(CURDIR)/bin:$(PATH)`

(since the `build` step generates `bin/snakemake` and the
PYBUILD_BEFORE_TEST just copies it into the build directory).

However, having done this, about half of the tests still fail, with
various errors. Some are related to build dependencies
(google-cloud-sdk, boto3) but others are less obvious. Quite a few refer
to a stacktrace within scheduler.py; if this is trying to run a
multiprocess scheduler as part of a build-time test this sounds likely
to be an ongoing source of pain.

This also occurs if you make the temporary (but guaranteed to fail in
future) fix of just changing `pythonX.Y_3.5` to `3.6`, so I think there
are deeper issues with how the test suite needs to be run, which
probably need input from someone familiar with the package in question.

> 
> Any help would be welcome
> 
>   Andreas.
> 
> On Sat, Oct 14, 2017 at 05:33:22AM +0300, Adrian Bunk wrote:
>> Source: snakemake
>> Version: 3.10.0-1
>> Severity: serious
>> Tags: buster sid
>>
>> https://tests.reproducible-builds.org/debian/rb-pkg/unstable/amd64/snakemake.html
>>
>> ...
>> ==
>> ERROR: Failure: Exception (snakemake not in PATH. For testing, install 
>> snakemake with 'pip install -e .'. You should do this in a separate 
>> environment (via conda or virtualenv).)
>> --
>> Traceback (most recent call last):
>>   File "/usr/lib/python3/dist-packages/nose/failure.py", line 39, in runTest
>> raise self.exc_val.with_traceback(self.tb)
>>   File "/usr/lib/python3/dist-packages/nose/loader.py", line 418, in 
>> loadTestsFromName
>> addr.filename, addr.module)
>>   File "/usr/lib/python3/dist-packages/nose/importer.py", line 47, in 
>> importFromPath
>> return self.importFromDir(dir_path, fqname)
>>   File "/usr/lib/python3/dist-packages/nose/importer.py", line 94, in 
>> importFromDir
>> mod = load_module(part_fqname, fh, filename, desc)
>>   File "/usr/lib/python3.6/imp.py", line 235, in load_module
>> return load_source(name, filename, file)
>>   File "/usr/lib/python3.6/imp.py", line 172, in load_source
>> module = _load(spec)
>>   File "<frozen importlib._bootstrap>", line 684, in _load
>>   File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
>>   File "<frozen importlib._bootstrap_external>", line 678, in exec_module
>>   File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
>>   File 
>> "/build/1st/snakemake-3.10.0/.pybuild/pythonX.Y_3.6/build/tests/tests.py", 
>> line 21, in <module>
>> raise Exception("snakemake not in PATH. For testing, install snakemake 
>> with "
>> Exception: snakemake not in PATH. For testing, install snakemake with 'pip 
>> install -e .'. You should do this in a separate environment (via conda or 
>> virtualenv).
>>
>> --
>> Ran 4 tests in 0.535s
>>
>> FAILED (errors=1)
>> E: pybuild pybuild:283: test: plugin distutils failed with: exit code=1: cd 
>> /build/1st/snakemake-3.10.0/.pybuild/pythonX.Y_3.6/build; python3.6 -m nose 
>> tests
>> dh_auto_test: pybuild --test --test-nose -i python{version} -p 3.6 returned 
>> exit code 13
>> debian/rules:18: recipe for target 'build' failed
>> make: *** [build] Error 25
>>
>>
>>
>> The problem is the the following line in debian/rules:
>>
>>   export PATH:=$(CURDIR)/.pybuild/pythonX.Y_3.5/build/bin:$(PATH)
>>
>> ___
>> Debian-med-packaging mailing list
>> debian-med-packag...@lists.alioth.debian.org
>> http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/debian-med-packaging
>>
> 



Re: ipython/jupyter issue [Was: RuntimeError: Kernel died before replying to kernel_info (#4116)]

2017-11-12 Thread Gordon Ball
On 10/11/17 17:28, Andreas Tille wrote:
> Hi,
> 
> On Thu, Nov 09, 2017 at 08:48:02PM +0100, Gordon Ball wrote:
>>
>> I just built statsmodels 0.8.0-6 in an amd64 sbuild chroot without
>> encountering this issue.
>>
>> Looking at the trace in #880245, it looks like the key error is
>>
>> zmq.error.ZMQError: Address already in use
>>
>> and the other
>>
>> RuntimeError: Kernel died before replying to kernel_info
>>
>> are just caused by the other process not being able to communicate. My
>> guess is that this is an environment heisenbug - container network
>> issues/socket exhaustion...
> 
> IMHO this would be a good reason to lower the severity of the bug, isn't
> it?

You can check this for yourself, but yes - it appears that it does not
consistently FTBFS, and therefore shouldn't be RC.

>  
>> It might be possible to get `ExecutePreprocessor` in
>> `tools/nbgenerate.py` to use IPC instead of TCP transport, which might
>> be more reliable for this sort of use (although it probably doesn't get
>> tested as much, so might have its own problems); this is a supported
>> option when invoking some CLI tools (eg, `jupyter console --transport
>> ipc --kernel python3`), but it's not obvious to me how to do it in this
>> case.
> 
> Any further hits?

I don't think I have time to dig into this at the moment, but given the
nbgenerate task is already run (docs/Makefile:58,70) with
`--allow_errors=True` to ignore errors within the notebooks, I'd just
patch the Makefile to ignore any errors from that command, which in the
worst case will result in some documentation notebooks not being populated.

> 
> Kind regards
> 
>  Andreas. 
> 



Re: ipython/jupyter issue [Was: RuntimeError: Kernel died before replying to kernel_info (#4116)]

2017-11-09 Thread Gordon Ball
On 09/11/17 15:27, Andreas Tille wrote:
> Hi,
> 
> statsmodels upstream claims that #880245 is a ipython/jupyter issue.
> 
> Is there anybody with some knowledge and knows how to fix this issue?

I just built statsmodels 0.8.0-6 in an amd64 sbuild chroot without
encountering this issue.

Looking at the trace in #880245, it looks like the key error is

zmq.error.ZMQError: Address already in use

and the other

RuntimeError: Kernel died before replying to kernel_info

are just caused by the other process not being able to communicate. My
guess is that this is an environment heisenbug - container network
issues/socket exhaustion...

It might be possible to get `ExecutePreprocessor` in
`tools/nbgenerate.py` to use IPC instead of TCP transport, which might
be more reliable for this sort of use (although it probably doesn't get
tested as much, so might have its own problems); this is a supported
option when invoking some CLI tools (eg, `jupyter console --transport
ipc --kernel python3`), but it's not obvious to me how to do it in this
case.
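
One possible route (an untested sketch, which assumes ExecutePreprocessor
forwards its traitlets config to the KernelManager it starts) would be:

```
from traitlets.config import Config
from nbconvert.preprocessors import ExecutePreprocessor

# Assumption: the config object is passed down to the KernelManager that
# ExecutePreprocessor creates, so the kernel uses IPC sockets rather than
# TCP ports.
config = Config()
config.KernelManager.transport = "ipc"

ep = ExecutePreprocessor(config=config, timeout=600, kernel_name="python3")
# nb would be a notebook node loaded via nbformat.read(...):
# ep.preprocess(nb, {"metadata": {"path": "."}})
```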

Gordon

> 
> Kind regards
> 
>Andreas.
> 
> - Forwarded message from Josef Perktold  -
> 
> Date: Thu, 09 Nov 2017 12:51:38 + (UTC)
> From: Josef Perktold 
> To: statsmodels/statsmodels 
> Cc: Andreas Tille , Author 
> Subject: Re: [statsmodels/statsmodels] RuntimeError: Kernel died before 
> replying to kernel_info (#4116)
> 
> This is a ipython/jupyter issue, and not related to statsmodels. You can ask 
> at their issue tracker.
> 



Re: RFS: jupyter components

2017-10-17 Thread Gordon Ball
On 17/10/17 13:31, Adam Cecile wrote:
> Hello Gordon,
> 
> We are really interested in getting notebook 5, any progress on this ?
> Is it possible to get your not-uploaded-yet packages (I did not find
> them on mentors.debian.net).

I have just uploaded the current RFS packages (ipython,
jupyter-notebook, jupyter-console, nbconvert) to mentors.d.n

There is also an ubuntu PPA (ppa:chronitis/jupyter) [1] containing
builds of these packages for xenial, zesty and artful.

They should also all be buildable from the git packaging repositories
(jupyter-notebook build-depends on the updated nbconvert package due to
build-time tests, the rest should build cleanly). [2]

[1]: https://launchpad.net/~chronitis/+archive/ubuntu/jupyter
[2]:
https://anonscm.debian.org/git/python-modules/packages/jupyter-notebook.git
; jupyter-console.git ; ipython4.git; nbconvert.git

> 
> Thanks in advance,
> 
> Best regards, Adam.
> 



Re: RFS: jupyter components

2017-10-10 Thread Gordon Ball


On 08/09/17 18:47, Julien Puydt wrote:
> Hi,
> 
> old thread but new things.
> 
> Le 28/08/2017 à 23:56, Gordon Ball a écrit :
> 
>>  * nbconvert: 5.2.1
>>
>>waiting on python-pandocfilters >= 1.4 (already in dpmt git, but
>>not yet uploaded)
> 
> The package is up to date, and I am theoretically allowed to upload it,
> but since it now produces one more binary, I am practically not allowed
> to upload it -- if someone from the list can sponsor, that would be nice.
> 
> If not, I'll just dput mentors and file a RFS to sponsorship-requests.

I guess this got lost in other things at the time. I think sponsorship
is still sought for:

 * ipython (5.1.0 -> 5.5.0)
 * nbconvert (4.2.0 -> 5.3.1)
 * jupyter-notebook (4.2.3 -> 5.1.0)
 * jupyter-console (5.0.0 -> 5.2.0)

Updating jupyter-notebook depends on nbconvert, the other packages
should be independent.



Re: Python 3 Statsmodels & Pandas

2017-09-18 Thread Gordon Ball
On 18/09/17 09:48, Stuart Prescott wrote:
> Hi Diane,
> 
> On Sunday, 17 September 2017 22:14:18 AEST Diane Trout wrote:
>> I just did it that way because it was the least disruptive change I
>> could make that would let me build and test the package.
> 
> Sure, that's entirely sensible.
> 
>> In my experience I'm much more likely to to notice a build time test
>> failure than one from the CI system. 
> 
> Absolutely agreed. I'm thinking that this is a rather exceptional situation 
> because of the circular dependency and the fall-out that we regularly see 
> from 
> that.
> 
>> What do other people think? If there are autopkgtests, should you still
>> let dh_auto_test run tests?
> 
> (I know I'm not the 'other people' from whom you solicit replies, I was just 
> perhaps unclear in what I was musing on before so I want to be more clear now)
> 
> Like you, I would *normally* run all tests in both places: buildd tests catch 
> severe problems right now; ci.d.n reruns the tests each time new versions of 
> dependencies are uploaded too and later breakage is caught.
> 
> Perhaps this is not a normal situation, however. Either one of "circular 
> dependencies" or "packages that often FTBFS on one or more architectures" 
> would be unusual enough to justify doing something different. All I am 
> wondering (from my position of ignorance!) if in this case, perhaps the tests 
> that cause the circular dependency can be disabled or xfailed, with the 
> remaining tests run as normal.

I now prefer to use autopkgtest in place of build-time tests; the former
has the advantages:

 * testing the installed package rather than the source tree, which
ensures that the binary package has sufficient dependencies and the
correct (or at least sufficient) contents
 * avoids dependency loops, as noted
 * avoids the need for workarounds required because of the location of
the files being tested at build time (see nbconvert issues with
entry_points not being found at build time)
 * can detect external breakage between uploads via ci.d.n

The downsides are that:

 * boilerplate is required (copying test files out of the source tree to
$AUTOPKGTEST_TMP to avoid testing against the source tree, instead of
the installed package)
 * extended build times if using autopkgtest as a post-build hook
(installing a chroot multiple times, if using sbuild/similar), and other
devs aren't guaranteed to run such tests at build/upload time

I wonder if it is possible (or a worthwhile use of time) to try and
provide an extended autodep8 python mode which runs the equivalent of
dh_auto_test as an autopkgtest, for packages which would opt-in with, eg
"Testsuite: autopkgtest-pkg-python-pytest|nose", rather than the basic
namespace-can-be-imported test.

Gordon



Re: RFS: jupyter components

2017-09-04 Thread Gordon Ball
On 04/09/17 19:23, Julien Puydt wrote:
> Hi,
> 
> Le 04/09/2017 à 18:17, Julien Puydt a écrit :
> 
>> There are quite many things to fix to correct before uploading, though :
>> lintian is not happy.
> 
> Lintian still complains about two things:
> W: nbconvert source: newer-standards-version 4.1.0 (current is 4.0.0)
> I: nbconvert source: testsuite-autopkgtest-missing

My understanding is that this field is unnecessary if d/tests/control
exists [policy 5.6.30], so it's only needed for the other packages,
for which we're relying on autodep8 to generate an autopkgtest.

> 
> The first is ok.
> 
> The second is strange: autopkgtest is quite happy with the dsc...
> 
> I would say : ready to upload as soon as we have a newer nbsphinx.

Sounds reasonable. Thanks for looking into this one.

> 
> Snark
> 

[policy 5.6.30]:
https://www.debian.org/doc/debian-policy/ch-controlfields.html#s-f-Testsuite

This field is automatically added to Debian source control files by dpkg
when a debian/tests/control file is present in the source package.



Re: RFS: jupyter components

2017-09-04 Thread Gordon Ball



On Mon, 4 Sep 2017, Julien Puydt wrote:


Hi,

Le 28/08/2017 à 23:56, Gordon Ball a écrit :


 * nbconvert: 5.2.1

   waiting on python-pandocfilters >= 1.4 (already in dpmt git, but
   not yet uploaded)


I updated it to latest upstream -- the build doesn't fail because of
python-pandocfilters afaict, but because for some reason entrypoints
don't get activated correctly. I still pushed my changes because I'm
sure I'm just missing something stupid...


Yes - sorry, should have added some more text for this one. I didn't find 
a solution to entry_points not working during dh_auto_test, so my solution 
to that was to disable the build-time test and run it instead as an 
installed autopkgtest, for which entry_points should work correctly.


The issue with pandocfilters arose because I always run autopkgtest as a 
build step, but while there is an easy way of getting sbuild to use other 
locally built packages (--extra-package=/dir/containing/debs), I haven't 
found a way of getting autopkgtest to do the same - so I can satisfy build 
dependencies with extra, locally-built packages, but not the autopkgtest 
dependencies.




Has anyone else encountered the issue with tests which rely on 
entry_points and how to make them work during dh_auto_test? Presumably 
there must be some environment/pybuild magic which will make them look in 
the right place.




Snark on #debian-python


Re: RFS: jupyter components

2017-08-29 Thread Gordon Ball



On Tue, 29 Aug 2017, Julien Puydt wrote:


Hi,

Le 28/08/2017 à 23:56, Gordon Ball a écrit :

Hello

The following packages should be ready for upload, if someone would be
willing to check and sponsor the uploads:

 * ipython 5.4.0-1

   IPython 6.x is now available, but is python3 only. For the moment,
   the existing ipython source package will be the 5.x series, and at
   some point it will cease to build the python3 package and a new
   source package based on ipython6 will take that binary over.

   Question for previous packagers: we currently ship a custom
   ipython.sh script as /usr/bin/ipython[3] instead of using the
   entry-points script that would otherwise be installed, but I can't
   find the rationale documented - faster startup?

 * jupyter-console: 5.2.0-1

   I tried out converting this one to gbp-pq (all others are still in
   git-dpm format)

 * jupyter-core: 4.3.0-1
 * nbformat 4.4.0-1
 * jupyter-client: 5.1.0-1


The remaining packages are currently blocked:

 * nbconvert: 5.2.1

   waiting on python-pandocfilters >= 1.4 (already in dpmt git, but
   not yet uploaded)

 * jupyter-notebook: 5.0.0

   working package available, but with some functionality limited due to
   unpackaged or out-of-date javascript libraries


I haven't dared touching those packages recently (and especially not
updating them to later upstream) precisely because I knew I would risk
killing Python2 support.


None of these updates remove or downgrade python 2.7 support. This was 
discussed a bit at https://lists.debian.org/debian-python/2017/06/msg00047.html;
all the 4.x and 5.x versions still support python 2.7.

This question is definitely going to come up in the not-too-distant 
future, but for the moment only IPython (v6.1) has released a 
python3-only version, and the watch file for the existing ipython source 
package has been restricted to only fetch versions <6. My proposal (see 
the thread linked above) is to introduce a separate source package 
("ipython6"?) for this version, and build only the python2 version from 
the existing source package.



Could you be more explicit about which javascript libraries are
unpackaged or out-of-date? I've been struggling since pretty long to
keep some of the javascript packages I made for jupyter up to date, so
perhaps at one point that effort might help...


Yes, I've seen you've been doing a lot of work on the javascript side - 
thanks for your work there.


As of notebook 5.0.0, the issues are:

 * jqueryui

   notebook uses an old version (1.10) whereas debian has 1.12
   (fixed with patches, but potentially affects compatibility of
   addons). Probably wontfix unless we embed jqueryui 1.10.

 * codemirror

   debian has 5.4, notebook requires 5.22 (but nonetheless works with 5.4,
   the API appears to be sufficiently stable). I investigated updating
   this; current is 5.29, but updating to > 5.19 requires rollup + various
   plugins. Updating to 5.19 for now to get at least some of the interim
   fixes might be pragmatic.

 * preact, preact-compat, proptypes

   none of these are currently packaged, and these are typescript libs
   with a fairly extensive dependency stack. These are only used to
   implement a shortcut editor, which the current notebook package just
   disables to avoid the dependencies - but shipping a deliberately
   crippled version is not ideal, even though it's not core functionality.



Snark on #debian-python and #debian-js


RFS: jupyter components

2017-08-28 Thread Gordon Ball
Hello

The following packages should be ready for upload, if someone would be
willing to check and sponsor the uploads:

 * ipython 5.4.0-1

   IPython 6.x is now available, but is python3 only. For the moment,
   the existing ipython source package will be the 5.x series, and at
   some point it will cease to build the python3 package and a new
   source package based on ipython6 will take that binary over.

   Question for previous packagers: we currently ship a custom
   ipython.sh script as /usr/bin/ipython[3] instead of using the
   entry-points script that would otherwise be installed, but I can't
   find the rationale documented - faster startup?

 * jupyter-console: 5.2.0-1

   I tried out converting this one to gbp-pq (all others are still in
   git-dpm format)

 * jupyter-core: 4.3.0-1
 * nbformat 4.4.0-1
 * jupyter-client: 5.1.0-1


The remaining packages are currently blocked:

 * nbconvert: 5.2.1

   waiting on python-pandocfilters >= 1.4 (already in dpmt git, but
   not yet uploaded)

 * jupyter-notebook: 5.0.0

   working package available, but with some functionality limited due to
   unpackaged or out-of-date javascript libraries

Thanks

Gordon



Re: IPython/Jupyter plans for buster

2017-06-26 Thread Gordon Ball
I don't have more information than was in the [1] link below, but my 
reading of it is:


8<

All ipython/jupyter components will drop python 2.7 support from version 
6.


For ipython, ipykernel and their dependencies ipython-genutils and
traitlets, the current python-2.7-supporting branch gets extended
support until July 2019 (needed to run user-level python 2.7 code).


For components the user is meant to use as an application rather than an 
importable library (notebook, nbconvert) there will be no extended python 
2 support.


For components which might be useful to other projects (jupyter-core, 
jupyter-client, nbformat) there *might* be extended support of the current 
branch, but it isn't clear.


8<

I think we want to follow the same strategy: keep support for running 
python 2 code in interactive sessions and notebooks as long as possible, 
but for other parts like the notebook server which the user is not 
expected to import from, we drop python 2 support when upstream no longer 
provides it.


Which interfaces in jupyter does sagemath use? Is it just a kernel or does 
it import directly from notebook/nbformat/nbconvert, etc?


When you say (2), since I failed to number them originally, I think you 
meant:


split ipython source package,
python-ipython stays as version 5.x,
python3-ipython gets version 6.x from a new source package

(correct?)

Gordon

On Sun, 25 Jun 2017, Ximin Luo wrote:


Hey, it depends on the details.

I think (2) would be the best option, I think it's not worth the effort trying to set up 
an "alternatives" system, we have enough stuff to do already without needing to 
test all of this complexity.

One thing we have to watch out for, is if their LTS versions stays on older 
libraries that are incompatible with the newest versions (cough cough NPM, but 
also python libs might be affected). Hopefully this would only happen rarely.

Are they dropping python2 support for the core components as well? It would be 
pretty annoying to have to package two of *everything* just for python2 
support. (SageMath definitely needs it.)

X

Gordon Ball:

Hello

We currently have IPython 5.1 in the archive.

Upstream has announced [1][2] IPython/Jupyter 5.x as an LTS branch (36
months support, ending July 2019), and the last version to support
Python 2.7. The first releases of IPython 6 (supporting Python 3 only)
are now available.

There seem to be several possibilities:

 * Stick with 5.x only for this cycle
 * Stick with 5.x as the default for 2 and 3 and provide an alternative
6.x package for python 3
 * Stick with 5.x only for python 2.7 and create a separate ipython 6.x
source package for python 3

This decision probably only needs to be made for the "user visible"
components of jupyter (ipython, ipykernel and their dependencies); the
application/script parts (nbconvert, jupyter-notebook, qtconsole, etc)
can probably be moved to be python 3 only when required.

Gordon

[1]:
https://github.com/jupyter/roadmap/blob/master/accepted/migration-to-python-3-only.md
[2]: http://www.python3statement.org/

postscript: I've just pushed updates to git for several of the jupyter
packages
 * ipython -> 5.4.0
 * jupyter-client -> 5.1.0
 * nbconvert -> 5.2.1 (blocked on pandocfilters >= 1.4)
 * ipykernel -> 4.6.1




--
GPG: ed25519/56034877E1F87C35
GPG: rsa4096/1318EFAC5FBBDBCE
https://github.com/infinity0/pubkeys.git





IPython/Jupyter plans for buster

2017-06-24 Thread Gordon Ball
Hello

We currently have IPython 5.1 in the archive.

Upstream has announced [1][2] IPython/Jupyter 5.x as an LTS branch (36
months support, ending July 2019), and the last version to support
Python 2.7. The first releases of IPython 6 (supporting Python 3 only)
are now available.

There seem to be several possibilities:

 * Stick with 5.x only for this cycle
 * Stick with 5.x as the default for 2 and 3 and provide an alternative
6.x package for python 3
 * Stick with 5.x only for python 2.7 and create a separate ipython 6.x
source package for python 3

This decision probably only needs to be made for the "user visible"
components of jupyter (ipython, ipykernel and their dependencies); the
application/script parts (nbconvert, jupyter-notebook, qtconsole, etc)
can probably be moved to be python 3 only when required.

Gordon

[1]:
https://github.com/jupyter/roadmap/blob/master/accepted/migration-to-python-3-only.md
[2]: http://www.python3statement.org/

postscript: I've just pushed updates to git for several of the jupyter
packages
 * ipython -> 5.4.0
 * jupyter-client -> 5.1.0
 * nbconvert -> 5.2.1 (blocked on pandocfilters >= 1.4)
 * ipykernel -> 4.6.1



Re: replacement for ipython(3)-notebook?

2016-11-09 Thread Gordon Ball
On 09/11/16 16:49, Julien Puydt wrote:
> Hi,
> 
> On 09/11/2016 16:24, Zack Weinberg wrote:
>> ipython(3)-notebook has been dropped from unstable with the transition
>> to ipython 5.0.0.  What package now provides this functionality?
> 
> Gordon Ball is working on packaging jupyter-notebook.
> 
> It's part of the effort to package sagemath in Debian -- the discussion
> happens mostly on the debian-science-sagemath mailing-list.
> 
> Snark on #debian-python
> 

Indeed.

The package name will be jupyter-notebook. It will hopefully be uploaded
soon (one or two minor issues still to fix), but you can find the
packaging at [1] if you want to build it yourself in the meanwhile.

There isn't a direct replacement for the python and python3 versions -
after the jupyter <-> ipython split you should also install
python(3)-ipykernel to support running python or python3 in the notebook.

Gordon

[1]:
https://anonscm.debian.org/git/python-modules/packages/jupyter-notebook.git



Re: Packaging/installing jupyter kernels

2016-08-05 Thread Gordon Ball
On 05/08/16 15:17, Julien Puydt wrote:
> Hi,
> 
> On 05/08/2016 14:07, Gordon Ball wrote:
>> Jupyter has been in experimental for a while and presumably will make it
>> to unstable in the not too distant future.
> 
> I don't think that will happen that early... there are a few things not
> ready yet and what is there isn't perfect yet. Help is welcome!

I might be able to help. What are the outstanding issues/blockers?

(I'm not actually a DPMT/PAPT member, but this might be a good time to
join).

> 
>> Once that is done, what is the correct way for packages providing a
>> jupyter kernel to install it?
>>
>>  * manually install kernel.json in /usr/share/jupyter/kernels/?
>>  * build-depend on python3-jupyter-core and call `jupyter kernelspec
>> install` during build?
>>  * runtime-depend and install in postinst instead?
>>  * (something else)
> 
> No idea.

Example: xonsh (a python-based shell) includes a kernel and tries to
install it from setup.py (currently disabled by not having jupyter
available at build time). But it sounds like the question is premature.

> 
> Snark on #debian-python
> 
chronitis on same



Packaging/installing jupyter kernels

2016-08-05 Thread Gordon Ball
Jupyter has been in experimental for a while and presumably will make it
to unstable in the not too distant future.

Once that is done, what is the correct way for packages providing a
jupyter kernel to install it?

 * manually install kernel.json in /usr/share/jupyter/kernels/?
 * build-depend on python3-jupyter-core and call `jupyter kernelspec
install` during build?
 * runtime-depend and install in postinst instead?
 * (something else)
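
For reference, the kernelspec itself is tiny - something along these lines
ends up in /usr/share/jupyter/kernels/<name>/kernel.json (the values here are
illustrative, following the usual python3/ipykernel kernel):

```
import json

# Illustrative only: argv/display_name follow the common python3/ipykernel
# kernel and would differ for other kernel packages.
spec = {
    "argv": ["python3", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
    "display_name": "Python 3",
    "language": "python",
}
print(json.dumps(spec, indent=1))
```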

Gordon



Re: Status of ipython-qtconsole

2016-06-11 Thread Gordon Ball
On 11/06/16 17:45, PICCA Frederic-Emmanuel wrote:
> Hello Julien
> 
>> Not as far as I know.  I'd be happy to review a package if that would
>> help.
> 
> I just pushed here my wip package
> 
> git clone 
> git+ssh://git.debian.org/git/python-modules/packages/python-qtconsole.git
> 
> I have debian/TODO.org file where I put the remaining things to do.
> 
> I would be glad if you could do a review (it is still work in progress)
> But this is my first package for python-modules
> 
> Here is the question which needs to be answered before uploading, even to 
> experimental
> 
> * qtconsole provide jupyter-qtconsole
> 
>   - [ ] for python3 what is the right scheme
> jupyter-qtconsole3 or jupyter3-qtconsole
> 
> the problem is that jupyter-core provides only jupyter and not jupyter3,
> so for now if I start
> jupyter qtconsole it starts a Python2 kernel
> 
> with jupyter-qtconsole3 -> jupyter qtconsole3 -> Python3
> with jupyter3-qtconsole -> jupyter 
> 

I think in jupyter this is meant to be handled by the --kernel option - ie

jupyter qtconsole --kernel python2
jupyter qtconsole --kernel python3

given the existence of non-python kernels (irkernel, ihaskell, iruby,
ijavascript, etc...)

> 
> Cheers,
> Fred
> 
> 
> Ps: I will add the -doc file  :)
>