Re: Bug#1060201: qa.debian.org: [udd] carnivore_emails is lacking lots of entries

2024-01-07 Thread Andreas Tille
Am Sun, Jan 07, 2024 at 05:58:36PM +0100 schrieb Lucas Nussbaum:
> 
> See https://salsa.debian.org/qa/udd/-/blob/master/udd/carnivore_gatherer.py.

Found this meanwhile.
 
> > This statement could easily be turned into INSERTs and would be a first
> > approach to enhancing the carnivore_emails table with more ids.
> > 
> > If you give some green light I could create such a statement and maybe
> > more enhancements by looking into more tables.
> 
> Please don't: the correct way to fix that is to improve the source data
> (in quantz:/org/qa.debian.org/carnivore)

I have not found this code in Salsa.  Is it true that the Python2 code I
can find at

   quantz:/org/qa.debian.org/carnivore

is the source code of carnivore?

Kind regards
   Andreas.

-- 
http://fam-tille.de



Re: Bug#1060201: qa.debian.org: [udd] carnivore_emails is lacking lots of entries

2024-01-07 Thread Andreas Tille
Control: usertag -1 udd

Am Sun, Jan 07, 2024 at 01:38:35PM +0100 schrieb Andreas Tille:
> Package: qa.debian.org
> Severity: normal
> 
> Hi,
> 
> I tried to analyse closed bugs by mapping done_email through carnivore_emails
> but realised that this table is lacking lots of entries, several of which I
> could easily add from my own memory:
> 
> SELECT done_email, COUNT(*) FROM (
>   SELECT done_email FROM archived_bugs WHERE id IN (
>     SELECT id FROM (
>       SELECT ab.id, ce.id AS ce_id
>         FROM archived_bugs ab
>         LEFT JOIN carnivore_emails ce ON ce.email = ab.done_email
>     ) noid WHERE ce_id IS NULL
>   ) AND done_email NOT IN ('ftpmas...@ftp-master.debian.org','nore...@salsa.debian.org','unknown')
> ) miss GROUP BY done_email
> ORDER BY count DESC
> ;
> 
>done_email   | count 
> +---
>  goth...@sapo.pt|  5221
>  ba...@quantz.debian.org|  2665
>  d...@cs.tu-berlin.de   |  2555
>  kit...@northeye.org|  2371
>  m...@linux.it|  2056
>  herb...@gondor.apana.org.au|  1900
>  da...@merkel.debian.org|  1788
>  daniel.baum...@progress-technologies.net   |  1393
>  m...@stro.at|  1327
>  b...@fs.tum.de |  1278
>  debian-...@adam-barratt.org.uk |  1155
>  cche...@cheney.cx  |  1031
>  sramac...@respighi.debian.org  |   992
> ...
>  zweistei...@gmx.de | 1
> (9075 rows)
> 
> I wonder how the carnivore_* tables are filled and whether you want me
> to draft some INSERT statements adding the most relevant emails, for
> which I would volunteer to sort out the corresponding IDs.
> 
> Kind regards
>Andreas.

BTW, it's probably pretty easy to resolve >900 of these missing e-mails
(a sketch of the resulting INSERTs follows after the result below):

CREATE TEMPORARY TABLE missing_in_carnivore_emails AS
SELECT done_email, COUNT(*) FROM (
  SELECT done_email FROM archived_bugs WHERE id IN (
    SELECT id FROM (
      SELECT ab.id, ce.id AS ce_id
        FROM archived_bugs ab
        LEFT JOIN carnivore_emails ce ON ce.email = ab.done_email
    ) noid WHERE ce_id IS NULL
  ) AND done_email NOT IN ('ftpmas...@ftp-master.debian.org','nore...@salsa.debian.org','unknown')
) miss GROUP BY done_email
ORDER BY count DESC
;

SELECT DISTINCT done_name, done_email, cn.id FROM
  (SELECT BTRIM(done_name, '"') AS done_name, done_email FROM archived_bugs) ab
  LEFT JOIN carnivore_names cn ON cn.name = ab.done_name
  WHERE done_email IN (SELECT done_email FROM missing_in_carnivore_emails WHERE count > 10)
    AND done_name IS NOT NULL AND done_name != ''
    AND id IS NOT NULL
;

        done_name        |           done_email           |  id
-------------------------+--------------------------------+------
 Camm Maguire            | c...@enhanced.com              | 6158
 Ross Vandegrift         | r...@kallisti.us               |  734
 Michael Ablassmeier     | a...@grinser.de                | 2751
 Neil McGovern           | maul...@halon.org.uk           | 3708
 Torsten Landschoff      | tors...@pclab.ifg.uni-kiel.de  | 6320
 Agney Lopes Roth Ferraz | ag...@users.sourceforge.net    | 4000
 Galen Hazelwood         | gal...@micron.net              | 1241
 Anand Kumria            | wildf...@progsoc.org           | 4175
 Adam Rogoyski           | rogoy...@cs.utexas.edu         | 1102
 Christophe Barbe        | christophe.ba...@ufies.org     | 2054
 Yann Dirson             | ydir...@fr.alcove.com          | 5804
 Arjan Oosting           | arjanoost...@home.nl           | 5366
 Julian Gilbey           | j.d.gil...@qmw.ac.uk           | 3875
 Norman Jordan           | njor...@shaw.ca                | 3513
 Michael Piefel          | pie...@informatik.hu-berlin.de |
 Frederic Lepied         | lep...@debian.org              | 2460
...
 Neil Williams           | li...@codehelp.co.uk           | 1552
 Christopher Martin      | chrsm...@freeshell.org         | 2754
 Andrew Lenharth         | a...@cs.wa
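
If carnivore_emails really only needs (id, email) pairs - which is my
assumption based on the join above - the matches could be turned into
INSERTs along these lines.  This is a sketch only: the name-based
matches (and names mapping to more than one carnivore id) would need
manual review first.

-- sketch only, assumes carnivore_emails(id, email) is sufficient
INSERT INTO carnivore_emails (id, email)
SELECT DISTINCT cn.id, ab.done_email
  FROM (SELECT BTRIM(done_name, '"') AS done_name, done_email FROM archived_bugs) ab
  JOIN carnivore_names cn ON cn.name = ab.done_name
 WHERE ab.done_email IN (SELECT done_email FROM missing_in_carnivore_emails WHERE count > 10)
   AND ab.done_name IS NOT NULL AND ab.done_name != '';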

Bug#1060201: qa.debian.org: [udd] carnivore_emails is lacking lots of entries

2024-01-07 Thread Andreas Tille
Package: qa.debian.org
Severity: normal

Hi,

I tried to analyse closed bugs by mapping done_email through carnivore_emails
but realised that this table is lacking lots of entries, several of which I
could easily add from my own memory:

SELECT done_email, COUNT(*) FROM (
  SELECT done_email FROM archived_bugs WHERE id IN (
    SELECT id FROM (
      SELECT ab.id, ce.id AS ce_id
        FROM archived_bugs ab
        LEFT JOIN carnivore_emails ce ON ce.email = ab.done_email
    ) noid WHERE ce_id IS NULL
  ) AND done_email NOT IN ('ftpmas...@ftp-master.debian.org','nore...@salsa.debian.org','unknown')
) miss GROUP BY done_email
ORDER BY count DESC
;

   done_email   | count 
+---
 goth...@sapo.pt|  5221
 ba...@quantz.debian.org|  2665
 d...@cs.tu-berlin.de   |  2555
 kit...@northeye.org|  2371
 m...@linux.it|  2056
 herb...@gondor.apana.org.au|  1900
 da...@merkel.debian.org|  1788
 daniel.baum...@progress-technologies.net   |  1393
 m...@stro.at|  1327
 b...@fs.tum.de |  1278
 debian-...@adam-barratt.org.uk |  1155
 cche...@cheney.cx  |  1031
 sramac...@respighi.debian.org  |   992
...
 zweistei...@gmx.de | 1
(9075 rows)

I wonder how the carnivore_* tables are filled and whether you want me
to draft some INSERT statements adding the most relevant emails, for
which I would volunteer to sort out the corresponding IDs.
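
For illustration: if my understanding of the schema is right - an
assumption on my side, namely that carnivore_names(id, name) and
carnivore_emails(id, email) share one id per person - the known
addresses of a person should show up like this:

SELECT cn.name, ce.email
  FROM carnivore_names cn
  JOIN carnivore_emails ce ON ce.id = cn.id
 WHERE cn.name = 'Andreas Tille';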

Kind regards
   Andreas.



Re: poco library is "basically orphaned", newer version available, Mumble has poco dependency

2024-01-07 Thread Andreas Tille
Hi Chris,

Am Tue, Jan 02, 2024 at 09:32:59PM -0500 schrieb Chris Knadle:
> I'm in kind of a tight spot and hope to get a "clearance to proceed" or
> reasonable guidance on how to handle this situation.
> 
> All recent versions of Mumble (>= 1.4) build-depend on poco c++ libraries
> which are RC buggy in Debian due to being outdated (depends on pcre3 which
> is slated to be removed -- pcre2 is the newer replacement). Poco is
> "basically orphaned" -- the last communicating developer has stated in
> Bug#89 that he removed himself from the package uploaders in the Git
> repo, and that all the other listed maintainers have not been heard from in
> "a long time" (it's apparently been years) -- but the package has not yet
> been orphaned via an upload.
> 
> There seem to be a total of 5 packages reverse-depending on poco libraries:
> clickhouse, clamfs, gm-assistant, gpsshogi, mumble -- all of these are
> blocked from migrating to Testing because of poco.
> 
> Here's what I would wish to do "in a perfect world":
> 
> 1. I'd like to do an NMU upload to orphan the package. This is required to
> get the package on the "orphaned" list to notify developers that a new
> maintainer is needed.
> 
> 2. I'd like to do an NMU upload of an updated version of the library,
> preferably with some help from a DD that has done library releases to insure
> the proper release processes are done. This seems like how things would be
> done in the Ubuntu world where packages don't have official maintainers.
> Updating the library would allow the packages depending on it to migrate,
> and would remove one of the 16 reverse-dependencies of libpcre3-dev.

This all sounds sensible.  Make sure you do so in its Git repository[1].
If you need help, it might be worth specifying in more detail what help is
needed - possibly also asking on debian-ment...@lists.debian.org.

Kind regards
Andreas.


[1] https://salsa.debian.org/debian/poco 

-- 
http://fam-tille.de



Bug#1055550: Removal of Python3 package of redland-bindings breaks mozilla-devscripts

2024-01-05 Thread Andreas Tille
Hi Doko,

thanks for working on the QA upload to fix #100 and #1056518 of
redland-bindings by simply removing the Python3 support.  Unfortunately
it breaks mozilla-devscripts and thus it can't migrate to testing[1].

Kind regards
Andreas.


[1] https://qa.debian.org/excuses.php?package=redland-bindings

-- 
http://fam-tille.de



Re: Bug#1057778: qa.debian.org: [udd] some names are not stripped from blanks and quotes (Was: UDD contains names where spaces are not stripped)

2023-12-08 Thread Andreas Tille
Control: usertag -1 udd

The problem was discussed on the debian-qa list.  The discussion starts here:

https://lists.debian.org/debian-qa/2023/12/msg7.html

I opened this bug report to keep a record of the discussion in the BTS.

There is one statement by Lucas Nussbaum[1] defending the current
situation:

> It has been like that for about 15 years. I'm not sure changing the API
> because you think is wrong is a good idea.

I do not consider strings injected into a database to be comparable to
an API.  The fact that this issue has existed for 15 years does not mean
that it is the right way to do things.  Some bugs are simply only
discovered after such a long time because the trigger for the issue is
quite rare.

I personally do not see how stripping names to make them more easily
parseable could break any application.  I'd be happy if someone could
convince me otherwise, for instance by describing an application that
relies on different spellings of the very same name.

Kind regards
Andreas.


[1] https://lists.debian.org/debian-qa/2023/12/msg00016.html

-- 
http://fam-tille.de



Bug#1057778: qa.debian.org: [udd] some names are not stripped from blanks and quotes

2023-12-08 Thread Andreas Tille
Package: qa.debian.org
Severity: normal

Hi,

in some tables, names are not stripped of spaces.  Example:

udd=> select '"' || u.name || '"' as name_with_spaces, uploader from uploaders 
u where name like '% ' or name like ' %' ;
 name_with_spaces | uploader  
--+---
 " Mehdi Dogguy"  |  Mehdi Dogguy 
 " David Paleino" |  David Paleino 
 " Stéphane Glondu"  |  Stéphane Glondu 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Stefano Zacchiroli"    |  Stefano Zacchiroli 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 "Andreas Tille  "| Andreas Tille   
 " LI Daobing"|  LI Daobing 
 " David Paleino" |  David Paleino 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 "Colin Tuckley " | Colin Tuckley  
 "Colin Tuckley " | Colin Tuckley  
 "Colin Tuckley " | Colin Tuckley  
(20 rows)

in other tables, names contain quotes that should be stripped as well:

select distinct done, done_name, done_email, owner, owner_name, owner_email
  from archived_bugs where done_name like '%"%' or owner_name like '%"%'
 order by done_name;
                              done                               | done_name |           done_email            |          owner          |       owner_name        |       owner_email
-----------------------------------------------------------------+-----------+---------------------------------+-------------------------+-------------------------+-------------------------
                                                                 |           | der...@debian.org               | "vane...@gmail.com"     | "vane...@gmail.com"     | vane...@gmail.com
                                                                 |           | twer...@debian.org              | "Varun Hiremath"        | "Varun Hiremath"        | varunhirem...@gmail.com
 alexan...@belikoff.net (Alexander L. Belikoff)                  |           | alexan...@belikoff.net          | "Alexander L. Belikoff" | "Alexander L. Belikoff" | alexan...@belikoff.net
 a...@debian.org (Andreas B. Mundt)                              |           | a...@debian.org                 | "Andreas B. Mundt"      | "Andreas B. Mundt"      | a...@debian.org
 antoine.romain.dum...@gmail.com (Antoine R. Dumont (@ardumont)) |           | antoine.romain.dum...@gmail.com | "Antoine R. Dumont"     | "Antoine R. Dumont"     |

Re: UDD contains names where spaces and quotes are not stripped

2023-12-07 Thread Andreas Tille
Am Thu, Dec 07, 2023 at 08:36:12PM +0100 schrieb Lucas Nussbaum:
> On 07/12/23 at 20:24 +0100, Andreas Tille wrote:
> > Am Thu, Dec 07, 2023 at 07:59:38PM +0100 schrieb Lucas Nussbaum:
> > > On 07/12/23 at 09:58 +0100, Andreas Tille wrote:
> > > > 
> > > > udd=> select '"' || u.name || '"' as name_with_spaces, uploader from 
> > > > uploaders u where name like '% ' or name like ' %' ;
> > > >  name_with_spaces | uploader  
> > > > --+---
> > > >  " Mehdi Dogguy"  |  Mehdi Dogguy 
> > > >  " David Paleino" |  David Paleino 
> > > >  " Stéphane Glondu"  |  Stéphane Glondu 
> > > >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> > > >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> > > >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> > > >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> > > >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> > > >  "Andreas Tille  "| Andreas Tille   
> > > >  " LI Daobing"|  LI Daobing 
> > > >  " David Paleino" |  David Paleino 
> > > >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> > > >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> > > >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> > > >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> > > >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> > > >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> > > >  "Colin Tuckley " | Colin Tuckley  
> > > >  "Colin Tuckley " | Colin Tuckley  
> > > >  "Colin Tuckley " | Colin Tuckley  
> > > > (20 rows)
> > > > ...
> > > >UPDATE uploaders SET name = trim(name), uploader = trim(name) || ' ' 
> > > > || email WHERE name like ' %' or name like '% ' ;
> > > > 
> > 
> > 
> > BTW:  I found 
> > 
> > udd=> SELECT count(*), name FROM (SELECT CASE WHEN changed_by_name = '' 
> > THEN maintainer_name ELSE changed_by_name END AS name FROM upload_history) 
> > uh WHERE name ilike '%tille%'  group by name;
> >  count | name  
> > ---+---
> >  16524 | Andreas Tille
> > (1 Zeile)
> > 
> > So why do I have 8707 uploads per uploaders but 16524 per upload_history?

???

> > Is my assumption wrong that both values should match (modulo some wrongly
> > spelled names)

Could you please comment on these different results?
 
> If you look at the uploaders table, there are three columns:
> - 'uploader', than contains the raw data
> - 'name' and 'email' that contain the parsed (and trimmed) data
> 
> udd=> select uploader, name, email, count(*) from uploaders where uploader 
> ilike '%tille%' group by 1,2,3;
>   uploader  |  name   |  email   | 
> count 
> +-+--+---
>  Andreas Tille| Andreas Tille   | ti...@debian.org |  
> 8785
>  Andreas Tille| Andreas Tille   | andr...@an3as.eu |
>  1
>  Andreas Tille| Andreas Tille   | ti...@debian.org |
>  1
> 
> So, just use name and/or email?

Well, I am not looking for a solution to this (non-)problem.  I simply
think that not stripping spaces from values before injecting them into
UDD is wrong.  I just stumbled upon it when I ran the query above.

I then stumbled upon another issue which might be even worse:

select distinct done, done_name, done_email, owner, owner_name, owner_email
  from archived_bugs where done_name like '%"%' or owner_name like '%"%'
 order by done_name;
 done | done_name | done_email | owner | owner_name | owner_email
------+-----------+------------+-------+------------+-------------

Re: UDD contains names where spaces are not stripped

2023-12-07 Thread Andreas Tille
Am Thu, Dec 07, 2023 at 07:59:38PM +0100 schrieb Lucas Nussbaum:
> On 07/12/23 at 09:58 +0100, Andreas Tille wrote:
> > Hi,
> > 
> > by chance I realised that the uploaders table contains some names where 
> > names
> > are not stripped:
> > 
> > udd=> select '"' || u.name || '"' as name_with_spaces, uploader from 
> > uploaders u where name like '% ' or name like ' %' ;
> >  name_with_spaces | uploader  
> > --+---
> >  " Mehdi Dogguy"  |  Mehdi Dogguy 
> >  " David Paleino" |  David Paleino 
> >  " Stéphane Glondu"  |  Stéphane Glondu 
> >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> >  " Stefano Zacchiroli"    |  Stefano Zacchiroli 
> >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> >  "Andreas Tille  "| Andreas Tille   
> >  " LI Daobing"|  LI Daobing 
> >  " David Paleino" |  David Paleino 
> >  " Stefano Zacchiroli"|  Stefano Zacchiroli 
> >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> >  " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
> >  "Colin Tuckley " | Colin Tuckley  
> >  "Colin Tuckley " | Colin Tuckley  
> >  "Colin Tuckley " | Colin Tuckley  
> > (20 rows)
> > ...
> >UPDATE uploaders SET name = trim(name), uploader = trim(name) || ' ' || 
> > email WHERE name like ' %' or name like '% ' ;
> > 
> 
> Uploaders is refreshed every few hours from archive data, so a one-time
> UPDATE would not help. UDD usually tries to preserve inaccuracies, so
> those might be interesting for QA work.

OK.

> In your case, why don't you use the email address to identify uploaders?

Since this also does not work:

udd=> SELECT count(*), uploader FROM uploaders WHERE name ilike '%tille%'
      GROUP BY uploader;
 count |  uploader  
---+
 1 | Andreas Tille   
 1 | Andreas Tille 
  8785 | Andreas Tille 
(3 Zeilen)

> (possibly combining it with the carnivore data to identify different emails
> belonging to the same person ?)

I could fiddle around with carnivore but that's overkill for that
purpose, and I insist that not stripping blanks from names does not make
any sense, IMHO.
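
Just for the record: if carnivore_emails.id really identifies one person
(my assumption), the combination you suggest would presumably look
something like this sketch:

SELECT ce.id AS person, count(*) AS uploads
  FROM uploaders u
  JOIN carnivore_emails ce ON ce.email = u.email
 WHERE u.name ILIKE '%tille%'
 GROUP BY ce.id;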


BTW:  I found 

udd=> SELECT count(*), name FROM (SELECT CASE WHEN changed_by_name = '' THEN 
maintainer_name ELSE changed_by_name END AS name FROM upload_history) uh WHERE 
name ilike '%tille%'  group by name;
 count | name  
---+---
 16524 | Andreas Tille
(1 Zeile)

So why do I have 8707 uploads according to uploaders but 16524 according
to upload_history?

Is my assumption wrong that both values should match (modulo some wrongly
spelled names)?
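
For reference, the two counts could be put side by side like this (just
a sketch using the same name filter as above):

SELECT 'uploaders' AS source_table, count(*)
  FROM uploaders WHERE name ILIKE '%tille%'
UNION ALL
SELECT 'upload_history', count(*)
  FROM upload_history
 WHERE (CASE WHEN changed_by_name = '' THEN maintainer_name
             ELSE changed_by_name END) ILIKE '%tille%';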

Kind regards
Andreas.

-- 
http://fam-tille.de



Re: Appstream data not in UTF-8?

2023-12-07 Thread Andreas Tille
Am Thu, Dec 07, 2023 at 11:03:29AM +0100 schrieb Raphael Hertzog:
> On Thu, 07 Dec 2023, Raphael Hertzog wrote:
> > tracker.debian.org has been failing to import the appstream metadata for a
> > while (since November 26th) with this exception:
> 
> Quick correction. The first time it failed that way was on November 19th
> at 08:13 UTC.

I do not remember the time exactly but my gut feeling says this is pretty
close to the bookworm upgrade.

Kind regards
Andreas. 

-- 
http://fam-tille.de



UDD contains names where spaces are not stripped

2023-12-07 Thread Andreas Tille
Hi,

by chance I realised that the uploaders table contains some names that
are not stripped:

udd=> select '"' || u.name || '"' as name_with_spaces, uploader from uploaders 
u where name like '% ' or name like ' %' ;
 name_with_spaces | uploader  
--+---
 " Mehdi Dogguy"  |  Mehdi Dogguy 
 " David Paleino" |  David Paleino 
 " Stéphane Glondu"  |  Stéphane Glondu 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Stefano Zacchiroli"    |  Stefano Zacchiroli 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 "Andreas Tille  "| Andreas Tille   
 " LI Daobing"|  LI Daobing 
 " David Paleino" |  David Paleino 
 " Stefano Zacchiroli"|  Stefano Zacchiroli 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 " Nikita V. Youshchenko" |  Nikita V. Youshchenko 
 "Colin Tuckley " | Colin Tuckley  
 "Colin Tuckley " | Colin Tuckley  
 "Colin Tuckley " | Colin Tuckley  
(20 rows)


This causes slight errors when counting people's uploads.  My guess is
that this is due to some old importer code (I've checked the hit for my
name, which comes from a pretty old upload).  Thus I wonder whether the
easiest fix might be to simply clean this up with a proper UPDATE
statement that removes the unneeded spaces.  This statement does the
trick in my local clone:

   UPDATE uploaders SET name = trim(name), uploader = trim(name) || ' ' || email
    WHERE name like ' %' or name like '% ' ;

If I'm not mistaken, historic uploads will not be re-imported from
scratch, so this would cure the situation.  Otherwise users always need
to remember to add trim(name) when dealing with the uploaders.name
column, not to mention that it gets even harder to deal with the
uploader column, which might feature extra spaces in the middle.
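
In other words, every consumer of the table ends up writing a workaround
like the following (illustration only) just to get correct counts:

SELECT trim(name) AS name, count(*)
  FROM uploaders
 WHERE trim(name) ILIKE '%tille%'
 GROUP BY trim(name);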

What do you think?

Kind regards
Andreas.

-- 
http://fam-tille.de



Bug#1055269: udd: bugs.cgi does not show bugs for source packages in non-free-firmware

2023-11-04 Thread Andreas Tille
Just a quick note: I won't be able to take care of this within the next 48 hours.

Am Fri, Nov 03, 2023 at 10:48:11PM +0100 schrieb Cyril Brulebois:
> Lucas Nussbaum  (2023-11-03):
> > UDD uses several independant "importers". The constraint you quoted is
> > in the "blends" importer (maintained by Andreas Tille, cced).
> 
> ACK, I spotted a number of things that were blends-related, didn't
> realize that particular schema was too.
> 
> > The reason why UDD thinks that #1055136 does not affect unstable, is
> > because the BTS thinks it does not affect unstable. If you look at the
> > version graph for the bug, you see that the BTS only knows about the
> > version in oldstable, not about the versions in stable/testing/unstable.
> > The same happens for other packages in non-free-firmware (see #1038610
> > for example).
> 
> https://github.com/dondelelcaro/debbugs/issues/2 then.
> 
> 
> Cheers,
> -- 
> Cyril Brulebois (k...@debian.org)<https://debamax.com/>
> D-I release manager -- Release team member -- Freelance Consultant



-- 
http://fam-tille.de



Re: Bug#1032587: UDD's upstream_metadata table may contain stale data?

2023-03-15 Thread Andreas Tille
Am Tue, Mar 14, 2023 at 10:05:33PM +0100 schrieb Andreas Tille:
> Am Tue, Mar 14, 2023 at 10:42:30PM +0200 schrieb Faidon Liambotis:
> > Thanks Andreas! Is the code and/or logs for this cronjob somewhere I can
> > access myself? Perhaps I could have a look myself and help you out?
> 
> Its
> 
>
> https://salsa.debian.org/blends-team/website/-/blob/master/misc/machine_readable/fetch-machine-readable_salsa.py
> 
> but I don't think this short-term issue is worth your time.  It should
> run on blends.debian.net, since most of the watched projects are cached
> there.  For some reason the job seems to fail for
> 
>https://salsa.debian.org/python-team/packages/kazam
> 
> but I need to sort out whether this suspicion is true

That suspicion was wrong, and I'm now convinced that we are being blocked
by some measure Salsa uses to prevent DoS attacks.  That's why I increased
the time span between fetches from Salsa (we have time, we just need to be
done in less than one day) and added more exception handling[1] to
hopefully survive errors like these:

> raise RemoteDisconnected("Remote end closed connection without"
> http.client.RemoteDisconnected: Remote end closed connection without response
> ...
> raise RemoteDisconnected("Remote end closed connection without"
> urllib3.exceptions.ProtocolError: ('Connection aborted.', 
> RemoteDisconnected('Remote end closed connection without response'))
> ...
> raise ConnectionError(err, request=request)
> requests.exceptions.ConnectionError: ('Connection aborted.', 
> RemoteDisconnected('Remote end closed connection without response'))

It seems I should also make sure cron will send me some mail when this
job fails, since it was not working for quite some time.

BTW, I might need help convincing the Salsa admins that it makes sense
to parse these machine-readable files right on Salsa.  In Alioth times I
started with a job that read the repositories directly, which consumed
far less network bandwidth.  Since the move to Salsa this is not
possible any more.  I tried really hard to cache the results and keep
the network traffic as low as possible (downloading single files only).
However, as we can see, this is not reliable any more (which it was for
a couple of years).

Kind regards
   Andreas.

[1] 
https://salsa.debian.org/blends-team/website/-/commit/a51cf1cfeadc4693aaacfb5e74113805a286ebe1

-- 
http://fam-tille.de



Bug#1032587: UDD's upstream_metadata table may contain stale data?

2023-03-14 Thread Andreas Tille
Am Tue, Mar 14, 2023 at 10:42:30PM +0200 schrieb Faidon Liambotis:
> Thanks Andreas! Is the code and/or logs for this cronjob somewhere I can
> access myself? Perhaps I could have a look myself and help you out?

Its

   
https://salsa.debian.org/blends-team/website/-/blob/master/misc/machine_readable/fetch-machine-readable_salsa.py

but I don't think this short-term issue is worth your time.  It should
run on blends.debian.net, since most of the watched projects are cached
there.  For some reason the job seems to fail for

   https://salsa.debian.org/python-team/packages/kazam

but I need to sort out whether this suspicion is true and, if so, why
there is this error:


Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 699, in 
urlopen
httplib_response = self._make_request(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 445, in 
_make_request
six.raise_from(e, None)
  File "", line 3, in raise_from
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 440, in 
_make_request
httplib_response = conn.getresponse()
  File "/usr/lib/python3.9/http/client.py", line 1347, in getresponse
response.begin()
  File "/usr/lib/python3.9/http/client.py", line 307, in begin
version, status, reason = self._read_status()
  File "/usr/lib/python3.9/http/client.py", line 276, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 755, in 
urlopen
retries = retries.increment(
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 532, in 
increment
raise six.reraise(type(error), error, _stacktrace)
  File "/usr/lib/python3/dist-packages/six.py", line 718, in reraise
raise value.with_traceback(tb)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 699, in 
urlopen
httplib_response = self._make_request(
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 445, in 
_make_request
six.raise_from(e, None)
  File "", line 3, in raise_from
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 440, in 
_make_request
httplib_response = conn.getresponse()
  File "/usr/lib/python3.9/http/client.py", line 1347, in getresponse
response.begin()
  File "/usr/lib/python3.9/http/client.py", line 307, in begin
version, status, reason = self._read_status()
  File "/usr/lib/python3.9/http/client.py", line 276, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', 
RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File 
"/srv/blends.debian.org/misc/machine_readable/fetch-machine-readable_salsa.py", 
line 107, in output_metadata
items = project.repository_tree(path=subdir, recursive=False)
  File "/usr/lib/python3/dist-packages/gitlab/cli.py", line 42, in wrapped_f
return f(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/gitlab/exceptions.py", line 279, in 
wrapped_f
return f(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/gitlab/v4/objects/__init__.py", line 
4663, in repository_tree
return self.manager.gitlab.http_list(gl_path, query_data=query_data, 
**kwargs)
  File "/usr/lib/python3/dist-packages/gitlab/__init__.py", line 646, in 
http_list
return list(GitlabList(self, url, query_data, get_next=False, **kwargs))
  File "/usr/lib/python3/dist-packages/gitlab/__init__.py", line 777, in 
__init__
self._query(url, query_data, **self._kwargs)
  File "/usr/lib/python3/dist-packages/gitlab/__init__.py", line 782, in _query
result = self._gl.http_request("get", url, query_data=query_data, **kwargs)
  File "/usr/lib/python3/dist-packages/gitlab/__init__.py", line 531, in 
http_request
result = self.session.send(prepped, timeout=timeout, **settings)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 498, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', 
RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File 
"/srv/blends.debian.org/misc/machine_readable/fetch-machine-readable_salsa.py", 
line 283, in 
if output_metadata(pr, 'debian', debianmetadata):
  

Bug#1032587: UDD's upstream_metadata table may contain stale data?

2023-03-14 Thread Andreas Tille
Hi Faidon,

Am Thu, Mar 09, 2023 at 04:45:04PM +0200 schrieb Faidon Liambotis:
> This is my first attempt to query this data, so I hope this isn't an
> operator error!

No.

Thanks a lot for your bug report.  I realised that a cron job gathering
the upstream metadata files (and other machine-readable files) has been
crashing at some point.  I need to check this and can't promise
anything, but it's really good that you filed this bug report: the crash
apparently happens after the packages I'm watching closely have already
been processed, and thus I did not notice it.

Kind regards
   Andreas.

-- 
http://fam-tille.de



Public UDD mirror not updated

2022-11-29 Thread Andreas Tille
Hi,

when logging into the public UDD mirror and running a simple example query

udd=> select source, version, release from sources where source='r-cran-blockmodeling';
source| version | release  
--+-+--
 r-cran-blockmodeling | 0.1.8-1 | stretch
 r-cran-blockmodeling | 0.3.4-1 | buster
 r-cran-blockmodeling | 1.0.0-1 | bullseye
 r-cran-blockmodeling | 1.1.3-1 | bookworm
 r-cran-blockmodeling | 1.1.3-1 | sid

I realise that it is lagging behind what UDD has.  If you look at

   https://tracker.debian.org/pkg/r-cran-blockmodeling

the package has even migrated to testing with version 1.1.4, so I guess
the public mirror has not been updated in the last 2-3 days.

Could you please have a look?

Thanks a ton anyway for this great service

  Andreas.

-- 
http://fam-tille.de



UDD upstream importer seems to fail randomly for sf.net

2022-01-25 Thread Andreas Tille
Hi,

when checking the UDD dashboard for the Debian Med team[1] I see lots of

   debian/watch: uscan returned an error: In debian/watch no matching files for watch line http://sf.net/

messages.  I suspect that this affects all packages hosted at
SourceForge.  When running uscan manually on my local machine it
works nicely for at least the five examples I've checked.  It seems
there are circumstances under which the UDD importer just fails although
the watch file is perfectly OK?

I checked UDD (mirror) for success and failure when scanning
http://sf.net/ watch files:

udd=> SELECT sf_success, count(*)
        from (select source,
                     case when warnings like 'In debian/watch no matching files for watch line%'
                          THEN 'Fail' ELSE 'Success' END as sf_success
                from upstream
               where watch_file like '%http://sf.net/%') as tmp
       group by sf_success ;
 sf_success | count 
------------+-------
 Success    |   731
 Fail       |   544

I wanted to check the packages of Debian Med team and got:

udd=> select source, watch_file,
             case when warnings like 'In debian/watch no matching files for watch line%'
                  THEN 'Fail' ELSE 'Success' END as sf_success
        from upstream
       where watch_file like '%http://sf.net/%'
         and source in (select distinct source from sources
                         where maintainer_email = 'debian-med-packag...@lists.alioth.debian.org'
                           and release = 'sid')
       order by sf_success, source;
      source      |                          watch_file                           | sf_success
------------------+----------------------------------------------------------------+------------
 amide            | # See uscan(1) for format                                     +| Fail
                  |                                                               +|
                  | # Compulsory line, this is a version 3 file                   +|
                  | version=3                                                     +|
                  |                                                               +|
                  | http://sf.net/amide/amide-(1.*)\.tgz                          +|
                  |                                                                |
 codonw           | version=4                                                     +| Fail
                  | opts="uversionmangle=s/_/./g" \                               +|
                  | http://sf.net/codonw/CodonWSourceCode_([0-9_]+)\.tar\.gz      +|
                  |                                                                |
 ctn              | version=4                                                     +| Fail
                  | opts=dversionmangle=s/([~\+]dfsg)// \                         +|
                  | http://sf.net/mirctn/mirctn-(\d[\d\.]+)\.(?:tgz|tbz|txz|(?:tar\.(?:gz|bz2|xz))) +|
                  |                                                                |
 emboss-explorer  | version=4                                                     +| Fail
                  | http://sf.net/embossgui/emboss-explorer-(.*)\.tar\.gz         +|
                  |                                                                |
 fis-gtm          | version=4                                                     +| Fail
                  | http://sf.net/fis-gtm/fis-gtm-V(\d\.\d-\d+[A-F]*)\.tar\.gz    +|
                  |                                                                |
 fsa              | version=4                                                     +| Fail
                  | opts="repacksuffix=+dfsg,dversionmangle=s/\+dfsg//g" \        +|

Re: Refresh thread (Was: Started porting UDD to Python3 (Was: [UDD] Is there some effort to port UDD to Python3?))

2022-01-24 Thread Andreas Tille
Hi again,

I've added this as an issue to "Grow your ideas for Debian Project":

   https://salsa.debian.org/debian/grow-your-ideas/-/issues/13

Kind regards

  Andreas.

Am Sun, Jan 16, 2022 at 04:28:29PM +0100 schrieb Andreas Tille:
> Hi,
> 
> as far as I can see we can not upgrade the machine running UDD
> to current stable since it is not ported to Python3 yet.  I wonder
> whether the authors involved into this project want to grab their
> code and port it to Python3 (inside the python3 branch of the
> repository).
> 
> Kind regards
> 
>Andreas.
> 
> Am Mon, May 18, 2020 at 09:57:33PM +0200 schrieb Andreas Tille:
> > On Mon, May 18, 2020 at 08:35:33PM +0200, Stéphane Blondon wrote:
> > > 
> > > Can you send me the file 'gatherer.${I_dont_know_the_command}' which
> > > raises the UnicodeDecodeError exception? I will try to write a working
> > > patch.
> > 
> > I simply added a debug line:
> > 
> > udd(python3) $ git diff
> > diff --git a/udd/ddtp_gatherer.py b/udd/ddtp_gatherer.py
> > index bbf041b..d32b85f 100644
> > --- a/udd/ddtp_gatherer.py
> > +++ b/udd/ddtp_gatherer.py
> > @@ -239,6 +239,7 @@ class ddtp_gatherer(gatherer):
> >self.log.exception("Error reading %s%s", dir, filename)
> >  
> >  def _open_file(path):
> > +print(path)
> >  with open(path, 'rb') as f:
> >  raw_content = f.read()
> >  encoding = chardet.detect(raw_content)["encoding"]
> > 
> > 
> > which leads to
> > 
> > 
> > udd(python3) $ ./update-and-run.sh ddtp
> > /srv/mirrors/debian/dists/squeeze-proposed-updates/main/i18n/Translation-en.bz2
> > /srv/mirrors/debian/dists/squeeze-proposed-updates/non-free/i18n/Translation-en.bz2
> > /srv/mirrors/debian/dists/squeeze-proposed-updates/contrib/i18n/Translation-en.bz2
> > /srv/mirrors/debian/dists/stretch-proposed-updates/main/i18n/Translation-en.bz2
> > Traceback (most recent call last):
> >   File "/srv/udd.debian.org/udd//udd.py", line 88, in 
> > exec("gatherer.%s()" % command)
> >   File "", line 1, in 
> >   File "/srv/udd.debian.org/udd/udd/ddtp_gatherer.py", line 127, in run
> > h.update(f.read())
> >   File "/usr/lib/python3.8/codecs.py", line 322, in decode
> > (result, consumed) = self._buffer_decode(data, self.errors, final)
> > UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc5 in position 11: 
> > invalid continuation byte
> > 
> > 
> > While you can download the files from any Debian mirror I've attached
> >
> > /srv/mirrors/debian/dists/stretch-proposed-updates/main/i18n/Translation-en.bz2
> > to this mail.  My guess is that translations from stretch will not be
> > touched any more and thus we need to cope somehow with the existing
> > encoding.
> > 
> > Thanks a lot for your help
> > 
> > Andreas.
> > 
> > -- 
> > http://fam-tille.de
> 
> 
> 
> -- 
> http://fam-tille.de
> 
> 

-- 
http://fam-tille.de



Refresh thread (Was: Started porting UDD to Python3 (Was: [UDD] Is there some effort to port UDD to Python3?))

2022-01-16 Thread Andreas Tille
Hi,

as far as I can see we cannot upgrade the machine running UDD
to current stable since it is not ported to Python3 yet.  I wonder
whether the authors involved in this project want to grab their
code and port it to Python3 (inside the python3 branch of the
repository).

Kind regards

   Andreas.

Am Mon, May 18, 2020 at 09:57:33PM +0200 schrieb Andreas Tille:
> On Mon, May 18, 2020 at 08:35:33PM +0200, Stéphane Blondon wrote:
> > 
> > Can you send me the file 'gatherer.${I_dont_know_the_command}' which
> > raises the UnicodeDecodeError exception? I will try to write a working
> > patch.
> 
> I simply added a debug line:
> 
> udd(python3) $ git diff
> diff --git a/udd/ddtp_gatherer.py b/udd/ddtp_gatherer.py
> index bbf041b..d32b85f 100644
> --- a/udd/ddtp_gatherer.py
> +++ b/udd/ddtp_gatherer.py
> @@ -239,6 +239,7 @@ class ddtp_gatherer(gatherer):
>self.log.exception("Error reading %s%s", dir, filename)
>  
>  def _open_file(path):
> +print(path)
>  with open(path, 'rb') as f:
>  raw_content = f.read()
>  encoding = chardet.detect(raw_content)["encoding"]
> 
> 
> which leads to
> 
> 
> udd(python3) $ ./update-and-run.sh ddtp
> /srv/mirrors/debian/dists/squeeze-proposed-updates/main/i18n/Translation-en.bz2
> /srv/mirrors/debian/dists/squeeze-proposed-updates/non-free/i18n/Translation-en.bz2
> /srv/mirrors/debian/dists/squeeze-proposed-updates/contrib/i18n/Translation-en.bz2
> /srv/mirrors/debian/dists/stretch-proposed-updates/main/i18n/Translation-en.bz2
> Traceback (most recent call last):
>   File "/srv/udd.debian.org/udd//udd.py", line 88, in 
> exec("gatherer.%s()" % command)
>   File "", line 1, in 
>   File "/srv/udd.debian.org/udd/udd/ddtp_gatherer.py", line 127, in run
> h.update(f.read())
>   File "/usr/lib/python3.8/codecs.py", line 322, in decode
> (result, consumed) = self._buffer_decode(data, self.errors, final)
> UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc5 in position 11: 
> invalid continuation byte
> 
> 
> While you can download the files from any Debian mirror I've attached
>
> /srv/mirrors/debian/dists/stretch-proposed-updates/main/i18n/Translation-en.bz2
> to this mail.  My guess is that translations from stretch will not be
> touched any more and thus we need to cope somehow with the existing
> encoding.
> 
> Thanks a lot for your help
> 
> Andreas.
> 
> -- 
> http://fam-tille.de



-- 
http://fam-tille.de



Public UDD mirror offline? (Was: Public UDD mirror out of sync again)

2021-12-22 Thread Andreas Tille
Hi,

is the public mirror offline?

$ ping udd-mirror.debian.net
PING udd-mirror.debian.net (147.75.35.146) 56(84) bytes of data.
From node12.net.fosshost.org (139.178.85.99) icmp_seq=1 Destination Host Unreachable
From node12.net.fosshost.org (139.178.85.99) icmp_seq=2 Destination Host Unreachable

The cron job which I'm using to access the public mirror last ran
successfully on 2021-12-19.

Kind regards

Andreas.

-- 
http://fam-tille.de



Re: Public UDD mirror out of sync again

2021-12-14 Thread Andreas Tille
Am Tue, Dec 14, 2021 at 04:06:02PM +0100 schrieb Mattia Rizzolo:
> Hopefully at the very least reduced the probability to happen in the
> future with
> https://github.com/paulproteus/public-udd-mirror/commit/15df7d89fb6d447e0e4b57c3e3f24aa512414b64

Makes sense.
 
> Thanks for reporting!

Thanks a lot for providing the mirror

   Andreas.

-- 
http://fam-tille.de



Public UDD mirror out of sync again

2021-12-13 Thread Andreas Tille
Hi,

it seems the public UDD mirror is out of sync again.

UDD

udd=# select source, version, upstream_version from upstream where source like 
'r-cran-uwot' ;
   source| version  | upstream_version 
-+--+--
 r-cran-uwot | 0.1.11-1 | 0.1.11


Mirror:

udd=> select source, version, upstream_version from upstream where source like 
'r-cran-uwot' ;
   source| version  | upstream_version 
-+--+--
 r-cran-uwot | 0.1.10-1 | 0.1.11


(just to name an example of a package I've uploaded recently).

It would be great if this could be fixed (and prevented in future).

Thanks in any case for maintaining UDD mirror

  Andreas.


-- 
http://fam-tille.de



Re: [udd] Blends query runs into: terminating connection due to administrator command

2021-12-02 Thread Andreas Tille
Ping,

we just discussed in our Debian Med video conference whether
there is some chance to throw money at upgrading the
hardware of the machine running UDD.  What do you think about
this?

Kind regards
Andreas.

Am Fri, Nov 26, 2021 at 08:22:06AM +0100 schrieb Andreas Tille:
> Am Mon, Nov 22, 2021 at 01:28:24PM +0100 schrieb Andreas Tille:
> > Hi Lucas,
> > 
> > Am Mon, Nov 22, 2021 at 11:30:08AM +0100 schrieb Lucas Nussbaum:
> > > Hi,
> > > 
> > > > Is there any chance to bump the performance of the database server
> > > > to deal with this kind of queries?  Any other idea how to solve the
> > > > issue would be really welcome.
> > > > 
> > > > I'm observing this issue since about one week - not really sure about
> > > > the exact time.
> > > 
> > > Can you clarify if it's about UDD or the mirror?
> > 
> > It is against UDD.
> 
> Any news about this?
> 
> Kind regards
>Andreas. 
> 
> -- 
> http://fam-tille.de
> 
> 

-- 
http://fam-tille.de



Re: [udd] Blends query runs into: terminating connection due to administrator command

2021-11-25 Thread Andreas Tille
Am Mon, Nov 22, 2021 at 01:28:24PM +0100 schrieb Andreas Tille:
> Hi Lucas,
> 
> Am Mon, Nov 22, 2021 at 11:30:08AM +0100 schrieb Lucas Nussbaum:
> > Hi,
> > 
> > > Is there any chance to bump the performance of the database server
> > > to deal with this kind of queries?  Any other idea how to solve the
> > > issue would be really welcome.
> > > 
> > > I'm observing this issue since about one week - not really sure about
> > > the exact time.
> > 
> > Can you clarify if it's about UDD or the mirror?
> 
> It is against UDD.

Any news about this?

Kind regards
   Andreas. 

-- 
http://fam-tille.de



Re: [udd] Blends query runs into: terminating connection due to administrator command

2021-11-22 Thread Andreas Tille
Hi Lucas,

Am Mon, Nov 22, 2021 at 11:30:08AM +0100 schrieb Lucas Nussbaum:
> Hi,
> 
> > Is there any chance to bump the performance of the database server
> > to deal with this kind of queries?  Any other idea how to solve the
> > issue would be really welcome.
> > 
> > I'm observing this issue since about one week - not really sure about
> > the exact time.
> 
> Can you clarify if it's about UDD or the mirror?

It is against UDD.

Kind regards

  Andreas.

-- 
http://fam-tille.de



[udd] Blends query runs into: terminating connection due to administrator command

2021-11-22 Thread Andreas Tille
Hi,

for some time the Debian Med packages have not been updated any more.  The
output of the job can be found in the logfile[1]:


Traceback (most recent call last):
  File "./tasks.py", line 28, in 
tasks.GetAllDependencies()
  File "/srv/blends.debian.org/webtools/blendstasktools.py", line 937, in 
GetAllDependencies
if td.GetTaskDependencies(source):
  File "/srv/blends.debian.org/webtools/blendstasktools.py", line 1380, in 
GetTaskDependencies
_execute_udd_query(query)
  File "/srv/blends.debian.org/webtools/blendstasktools.py", line 307, in 
_execute_udd_query
curs.execute(query)
psycopg2.OperationalError: terminating connection due to administrator command
CONTEXT:  SQL function "blends_query_packages" statement 1
SSL connection has been closed unexpectedly


I admit the query is a bit complex and deals with quite a lot of data.
Is there any chance to bump the performance of the database server
to deal with this kind of query?  Any other idea how to solve the
issue would be really welcome.

I've been observing this issue for about one week - I'm not really sure
about the exact starting time.
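
One thing that might be worth checking - if I read the error correctly,
which I am not sure about - is whether some server-side limit cuts
long-running queries off, e.g.:

SHOW statement_timeout;
SHOW idle_in_transaction_session_timeout;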

Kind regards

 Andreas.


[1] https://blends.debian.org/_logs/debian-med.err

-- 
http://fam-tille.de



Re: Public UDD mirror again not up to date

2021-11-16 Thread Andreas Tille
Hi Mattia,

Am Tue, Nov 16, 2021 at 09:30:43PM +0100 schrieb Mattia Rizzolo:
> Yes, this was caused by a network error that left a stale lockfile:
> 
> Log started at Fri Nov 12 22:05:01 UTC 2021
> lock taken at /tmp/update_udd.udd-mirror.lock
> Downloading udd.dump
> wget: unable to resolve host address 'udd.debian.org'
> 
> I removed the lockfile, so it should update at the next cron run.

Thanks a lot for keeping the public mirror up and running.  The fact
that we seem to notice quite early might show you that your service
is regularly and frequently used. ;-)
 
> (I'll try to keep a mental note to improve this handling, though this
> kind of error is empirically so rare that I'll likely forget)

I admit this kind of issue is rare enough that I would not really
ask you to do that.

Thanks again

  Andreas.

-- 
http://fam-tille.de



Public UDD mirror again not up to date

2021-11-15 Thread Andreas Tille
Hi,

it seems that the public UDD mirror again has issues updating its data:

UDD:

 source                   | r-cran-corrplot
 version                  | 0.91-1
 distribution             | debian
 release                  | sid
 component                | main
 watch_file               | version=4
                          | https://cloud.r-project.org/src/contrib/corrplot_([-\d.]*)\.tar\.gz
 signing_key_pgp          |
 signing_key_asc          |
 debian_uversion          | 0.91
 debian_mangled_uversion  | 0.91
 upstream_version         | 0.91
 upstream_url             | https://cloud.r-project.org/src/contrib/corrplot_0.91.tar.gz
 errors                   |
 warnings                 |
 status                   | up to date
 last_check               | 2021-11-14 03:50:11.801091


Public mirror:

 source                   | r-cran-corrplot
 version                  | 0.90-1
 distribution             | debian
 release                  | sid
 component                | main
 watch_file               | version=4
                          | https://cloud.r-project.org/src/contrib/corrplot_([-\d.]*)\.tar\.gz
 signing_key_pgp          |
 signing_key_asc          |
 debian_uversion          | 0.90
 debian_mangled_uversion  | 0.90
 upstream_version         | 0.91
 upstream_url             | https://cloud.r-project.org/src/contrib/corrplot_0.91.tar.gz
 errors                   |
 warnings                 |
 status                   | newer package available
 last_check               | 2021-11-12 01:21:09.843327


Could you please have a look?

Kind regards and thanks for maintaining that mirror

Andreas.

-- 
http://fam-tille.de



Re: UDD dump seems to be outdated (Was: Why is the list of outdated packages not getting updated?)

2021-10-13 Thread Andreas Tille
Am Wed, Oct 13, 2021 at 02:42:01PM +0200 schrieb Mattia Rizzolo:
> Indeed udd-mirror had stopped taking in updates due to a leaked
> lockfile.
> 
> It should be up2date now.

Thanks a lot

Andreas.

-- 
http://fam-tille.de



Re: Bug#995616: dh-make: create debian/upstream/medatada template

2021-10-12 Thread Andreas Tille
Am Wed, Oct 13, 2021 at 07:37:33AM +1100 schrieb Craig Small:
>   It was more that the main system [1] that used this data seemed to be
> offline as well as the debian med one [2] too.
>
> 2: http://debian-med.alioth.debian.org/tasks

Even when alioth was online, the page you should have referred to is

   https://blends.debian.org/med/tasks/

Where did you find that outdated link to alioth?  (This should have
been fixed a long time ago - sorry if anybody missed it.)

Kind regards

 Andreas.

-- 
http://fam-tille.de



Re: UDD dump seems to be outdated (Was: Why is the list of outdated packages not getting updated?)

2021-10-12 Thread Andreas Tille
Hi again,

I just checked and the public UDD mirror keeps presenting the outdated
results I wrote about below.  Could one of the admins have a look?

Thanks a lot

  Andreas.

Am Tue, Oct 12, 2021 at 04:42:21PM +0200 schrieb Andreas Tille:
> Hi,
> 
> Am Tue, Oct 12, 2021 at 02:43:56PM +0200 schrieb Lucas Nussbaum:
> > > udd=> select source , version, debian_mangled_uversion, upstream_version, 
> > > status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
> > > source| version | debian_mangled_uversion | upstream_version |   
> > > status   
> > > --+-+-+--+
> > >  r-cran-rwave | 2.6-0-1 | 2.6-0   | 2.6-0| up 
> > > to date
> > >  r-cran-s2| 1.0.7-3 | 1.0.7   | 1.0.7| up 
> > > to date
> > > (2 rows)
> > > 
> > > my local import has
> > > 
> > > udd=# select source , version, debian_mangled_uversion, upstream_version, 
> > > status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
> > > source| version | debian_mangled_uversion | upstream_version |   
> > > status   
> > > --+-+-+--+
> > >  r-cran-rwave | 2.5-0-1 | 2.5-0   | 2.5-0| up 
> > > to date
> > >  r-cran-s2| 1.0.6-1 | 1.0.6   | 1.0.6| up 
> > > to date
> > > (2 rows)
> > > 
> > > 
> > > Could someone please check the export procedure of the dump?
> > > This probably also affects the public udd mirror (even if I need
> > > to admit that I can not reach this currently).
> > 
> > lucas@ullmann:~$ ls -l /srv/udd.debian.org/udd/web/dumps/udd.dump
> > -rw-r--r-- 1 udd uddadm 1923508046 Oct 12 06:54 
> > /srv/udd.debian.org/udd/web/dumps/udd.dump
> > 
> > that looks fine?
> 
> Please note: I did not question whether the dump has the correct timestamp;
> my point was that it has incorrect *content*.  However, after downloading this dump I correctly get:
> 
> udd=# select source , version, debian_mangled_uversion, upstream_version, 
> status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
> source| version | debian_mangled_uversion | upstream_version |   
> status   
> --+-+-+--+
>  r-cran-rwave | 2.6-0-1 | 2.6-0   | 2.6-0| up to 
> date
>  r-cran-s2| 1.0.7-3 | 1.0.7   | 1.0.7| up to 
> date
> (2 rows)
> 
> 
> Fine.  However, the public UDD mirror has not synced yet:
> 
> 
> $ psql --port=5432 --host=udd-mirror.debian.net --username=udd-mirror udd
> ...
> udd=> select source , version, debian_mangled_uversion, upstream_version, 
> status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
> source| version | debian_mangled_uversion | upstream_version |
>  status  
> --+-+-+--+-
>  r-cran-rwave | 2.5-0-1 | 2.5-0   | 2.6-0| newer 
> package available
>  r-cran-s2| 1.0.6-1 | 1.0.6   | 1.0.7| newer 
> package available
> 
> 
> I just hope this will settle with the next update of the public mirror.
> 
> Kind regards
> 
>  Andreas.
> 
> -- 
> http://fam-tille.de
> 
> 

-- 
http://fam-tille.de



Re: UDD dump seems to be outdated (Was: Why is the list of outdated packages not getting updated?)

2021-10-12 Thread Andreas Tille
Hi,

Am Tue, Oct 12, 2021 at 02:43:56PM +0200 schrieb Lucas Nussbaum:
> > udd=> select source , version, debian_mangled_uversion, upstream_version, 
> > status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
> > source| version | debian_mangled_uversion | upstream_version |   
> > status   
> > --+-+-+--+
> >  r-cran-rwave | 2.6-0-1 | 2.6-0   | 2.6-0| up 
> > to date
> >  r-cran-s2| 1.0.7-3 | 1.0.7   | 1.0.7| up 
> > to date
> > (2 rows)
> > 
> > my local import has
> > 
> > udd=# select source , version, debian_mangled_uversion, upstream_version, 
> > status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
> > source| version | debian_mangled_uversion | upstream_version |   
> > status   
> > --+-+-+--+
> >  r-cran-rwave | 2.5-0-1 | 2.5-0   | 2.5-0| up 
> > to date
> >  r-cran-s2| 1.0.6-1 | 1.0.6   | 1.0.6| up 
> > to date
> > (2 rows)
> > 
> > 
> > Could someone please check the export procedure of the dump?
> > This probably also affects the public udd mirror (even if I need
> > to admit that I can not reach this currently).
> 
> lucas@ullmann:~$ ls -l /srv/udd.debian.org/udd/web/dumps/udd.dump
> -rw-r--r-- 1 udd uddadm 1923508046 Oct 12 06:54 
> /srv/udd.debian.org/udd/web/dumps/udd.dump
> 
> that looks fine?

Please note: I did not question whether the dump has the correct timestamp;
my point was that it has incorrect *content*.  However, after downloading this dump I correctly get:

udd=# select source , version, debian_mangled_uversion, upstream_version, 
status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
source| version | debian_mangled_uversion | upstream_version |   status 
  
--+-+-+--+
 r-cran-rwave | 2.6-0-1 | 2.6-0   | 2.6-0| up to 
date
 r-cran-s2| 1.0.7-3 | 1.0.7   | 1.0.7| up to 
date
(2 rows)


Fine.  However, the public UDD mirror has not synced yet:


$ psql --port=5432 --host=udd-mirror.debian.net --username=udd-mirror udd
...
udd=> select source , version, debian_mangled_uversion, upstream_version, 
status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
source| version | debian_mangled_uversion | upstream_version | 
status  
--+-+-+--+-
 r-cran-rwave | 2.5-0-1 | 2.5-0   | 2.6-0| newer 
package available
 r-cran-s2| 1.0.6-1 | 1.0.6   | 1.0.7| newer 
package available


I just hope this will settle with the next update of the public mirror.

Kind regards

 Andreas.

-- 
http://fam-tille.de



UDD dump seems to be outdated (Was: Why is the list of outdated packages not getting updated?)

2021-10-11 Thread Andreas Tille
Hi,

I think the issue described below by Nilesh Patra might be caused by the
fact that the UDD dump[3] is not updated properly.  I just downloaded
the dump and get a modification time of

   Modify: 2021-10-11 02:55:01.0 +0200

and md5sum

   fb9c6cb3ddb3d819983f7c0d82a50669  udd.dump

While the UDD on ullmann has

udd=> select source , version, debian_mangled_uversion, upstream_version, 
status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
source| version | debian_mangled_uversion | upstream_version |   status 
  
--+-+-+--+
 r-cran-rwave | 2.6-0-1 | 2.6-0   | 2.6-0| up to 
date
 r-cran-s2| 1.0.7-3 | 1.0.7   | 1.0.7| up to 
date
(2 rows)

my local import has

udd=# select source , version, debian_mangled_uversion, upstream_version, 
status from upstream where source in ('r-cran-s2','r-cran-rwave') ;
source| version | debian_mangled_uversion | upstream_version |   status 
  
--+-+-+--+
 r-cran-rwave | 2.5-0-1 | 2.5-0   | 2.5-0| up to 
date
 r-cran-s2| 1.0.6-1 | 1.0.6   | 1.0.6| up to 
date
(2 rows)


Could someone please check the export procedure of the dump?
This probably also affects the public UDD mirror (even though I have to
admit that I cannot reach it currently).

Kind regards

 Andreas.

Am Thu, Oct 07, 2021 at 06:02:59PM +0530 schrieb Nilesh Patra:
> I had uploaded r-cran-rwave couple of days back, and it has even
> migrated to testing. r-cran-s2 is also updated for a while, but
> it shows the older version here[1]
> 
> Same goes for med-team packages, for example augur, bbmap insilicoseq
> so on.
> 
> Something wrong with UDD again?

 
> [1]: 
> https://salsa.debian.org/r-pkg-team/maintenance-utilities/-/blob/master/outdated_r-packages.txt
> [2]: 
> https://salsa.debian.org/med-team/community/helper-scripts/-/blob/master/outdated_med-packages.txt
> 
> Nilesh

[3] https://udd.debian.org/dumps/udd.dump

-- 
http://fam-tille.de



Re: upload_history not updated (Was: UDD mirror not up to date)

2021-08-22 Thread Andreas Tille
On Sun, Aug 22, 2021 at 09:42:21AM +0200, Mattia Rizzolo wrote:
> 
> It takes 2-3 minutes.  but when I mailed last time I had already run it
> once, indeed this is the last record right now:
> 
> udd=> select source, version, date, distribution from upload_history order by 
> date desc limit 1;
>  source  | version |  date  | distribution
> -+-++--
>  tellico | 3.4.1-2 | 2021-08-21 10:03:46+00 | unstable
> (1 row)

OK, r-cran-stringi is now at the right version as well.  Thanks a lot.
 
> IOW, it looks "good" to me.  I'm running it again now.  however of
> course it'll only pick the uploads until I run the command.  Clearly
> somebody needs to take out the time to look at what happened and re-cron
> it all.

re-cronning it would be helpful, though.

Thanks a lot for your help

  Andreas.

-- 
http://fam-tille.de



Re: upload_history not updated (Was: UDD mirror not up to date)

2021-08-21 Thread Andreas Tille
Hi again,

On Sat, Aug 21, 2021 at 12:27:15PM +0200, Mattia Rizzolo wrote:
> > > Thanks a lot.  That's very convenient since we rely on UDD to organise
> > > the packages that need updates.
> > 
> > Seems that upload_history in UDD itself is not updated:
> 
> Yes, that's included in the importers that are stopped for now.
> 
> I'm running the "upload-history" importer manually right now, so you
> should see somethng appear soon, just for you :)

I have no idea how much time the importer takes but I haven't seen any
change so far.

Kind regards and thanks again for your effort

 Andreas.

-- 
http://fam-tille.de



Re: upload_history not updated (Was: UDD mirror not up to date)

2021-08-21 Thread Andreas Tille
On Sat, Aug 21, 2021 at 12:27:15PM +0200, Mattia Rizzolo wrote:
> 
> Yes, that's included in the importers that are stopped for now.
> 
> I'm running the "upload-history" importer manually right now, so you
> should see somethng appear soon, just for you :)

That's a really great service.  The rationale for why I'm so keen on these
data is that I want to prove that each new release results in a big bump
in the number of uploads - that the "freeze depression" has ended now.
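
For the record, the query I have in mind is roughly the following - a
sketch only, counting uploads per month from upload_history:

SELECT date_trunc('month', date) AS month, count(*) AS uploads
  FROM upload_history
 GROUP BY 1
 ORDER BY 1;

Plotting these monthly counts around the release dates should make the
bump (or its absence) visible.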

Thanks a lot for your support

  Andreas.

-- 
http://fam-tille.de



upload_history not updated (Was: UDD mirror not up to date)

2021-08-20 Thread Andreas Tille
Hi again,

On Thu, Aug 19, 2021 at 03:04:22PM +0200, Andreas Tille wrote:
> On Thu, Aug 19, 2021 at 11:29:20AM +0200, Mattia Rizzolo wrote:
> > 
> > I just re-enabled the cronjobs handling the dumps, so that at least
> > udd-mirror will keep updated.
> 
> Thanks a lot.  That's very convenient since we relay on UDD to organise
> the packages that need updates.

Seems that upload_history in UDD itself is not updated:

udd=# select source, version, date from upload_history where source = 'r-cran-stringi' order by version;
     source     | version |          date          
----------------+---------+------------------------
 r-cran-stringi | 0.5-5-1 | 2015-11-12 11:00:22+01
 r-cran-stringi | 1.0-1-1 | 2015-11-12 11:21:49+01
 r-cran-stringi | 1.1.2-1 | 2016-11-04 16:08:36+01
 r-cran-stringi | 1.1.5-1 | 2017-10-12 22:51:57+02
 r-cran-stringi | 1.1.6-1 | 2018-02-23 10:04:47+01
 r-cran-stringi | 1.1.7-1 | 2018-03-16 09:53:23+01
 r-cran-stringi | 1.2.2-1 | 2018-05-03 09:58:35+02
 r-cran-stringi | 1.2.3-1 | 2018-06-16 22:36:59+02
 r-cran-stringi | 1.2.4-1 | 2018-07-26 16:55:31+02
 r-cran-stringi | 1.2.4-2 | 2018-11-15 16:05:27+01
 r-cran-stringi | 1.4.3-1 | 2019-07-09 09:19:10+02
 r-cran-stringi | 1.4.5-1 | 2020-01-13 21:40:36+01
 r-cran-stringi | 1.4.6-1 | 2020-02-22 09:36:51+01
 r-cran-stringi | 1.5.3-1 | 2020-09-22 12:03:34+02
(14 Zeilen)


$ rmadison r-cran-stringi
r-cran-stringi | 1.1.2-1        | oldoldstable            | source, amd64, arm64, armel, armhf, i386, mips, mips64el, mipsel, ppc64el, s390x
r-cran-stringi | 1.2.4-2~bpo9+1 | stretch-backports       | source, amd64, arm64, armel, armhf, i386, mips, mips64el, mipsel, ppc64el, s390x
r-cran-stringi | 1.2.4-2~bpo9+1 | stretch-backports-debug | source
r-cran-stringi | 1.2.4-2        | oldstable               | source, amd64, arm64, armel, armhf, i386, mips, mips64el, mipsel, ppc64el, s390x
r-cran-stringi | 1.5.3-1        | stable                  | source, amd64, arm64, armel, armhf, i386, mips64el, mipsel, ppc64el, s390x
r-cran-stringi | 1.5.3-1        | testing                 | source, amd64, arm64, armel, armhf, i386, mips64el, mipsel, ppc64el, s390x
r-cran-stringi | 1.7.3-1        | unstable                | source, amd64, arm64, armel, armhf, i386, mips64el, mipsel, ppc64el, s390x
r-cran-stringi | 1.7.3-1        | unstable-debug          | source


I admit it would be really urgent to get this updated quickly.  I need
those data for my talk at DebConf. :-(

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: UDD mirror not up to date

2021-08-19 Thread Andreas Tille
On Thu, Aug 19, 2021 at 11:29:20AM +0200, Mattia Rizzolo wrote:
> 
> I just re-enabled the cronjobs handling the dumps, so that at least
> udd-mirror will keep updated.

Thanks a lot.  That's very convenient since we rely on UDD to organise
the packages that need updates.

Kind regards

   Andreas.

-- 
http://fam-tille.de



UDD mirror not up to date

2021-08-19 Thread Andreas Tille
Hi,

I'm just checking some R packages and realised that the UDD mirror
seems not to be up to date.

The mirror has:

$ psql --port=5432 --host=udd-mirror.debian.net --username=udd-mirror udd
udd=> select source, version, release from sources where source = 'r-cran-xslt' ;
   source    | version | release  
-------------+---------+----------
 r-cran-xslt | 1.3-1   | buster
 r-cran-xslt | 1.4.2-1 | bullseye
 r-cran-xslt | 1.4.2-1 | sid
(3 Zeilen)


original UDD has:

udd=> select source, version, release from sources where source = 'r-cran-xslt' ;
   source    | version | release  
-------------+---------+----------
 r-cran-xslt | 1.3-1   | buster
 r-cran-xslt | 1.4.2-1 | bullseye
 r-cran-xslt | 1.4.3-1 | bookworm
 r-cran-xslt | 1.4.3-1 | sid
(4 Zeilen)


Is there some issue with synchronising the mirror?

Kind regards

  Andreas.


-- 
http://fam-tille.de



Re: Access to UDD seems to have changed since 2021-06-01

2021-06-24 Thread Andreas Tille
Hi Mattia,

Which channel did you use, so that I can continue the discussion there?

Kind regards and thanks a lot for your investigation
   Andreas.

On Thu, Jun 24, 2021 at 05:28:27PM +0200, Mattia Rizzolo wrote:
> On Thu, Jun 17, 2021 at 10:39:53AM +0200, Andreas Tille wrote:
> > On Wed, Jun 16, 2021 at 03:29:45PM +0200, Mattia Rizzolo wrote:
> > > > the Blends web sentinel (running in dillon.d.o) can't connect to UDD any
> > > > more since 2021-06-01.  I'm using a Python script for the connection but
> > > > even the simple access via psql (described in Wiki[1]) fails:
> > > 
> > > This is definitely not something within UDD people's scope, but I
> > > forwarded it to DSA on IRC, thanks for reporting.
> > 
> > Thanks.  Are you able to reproduce the issue in some way (from another
> > host may be)?
> 
> The other hosts I usually access UDD from (quantz, master) seem to work
> just fine.
> 
> What DSA told me is that they don't think it ever worked:
> 
> [16 03:25:40 PM]  dillon% psql service=udd
> [16 03:25:40 PM]  psql: could not connect to server: Connection 
> refused
> [16 03:25:40 PM]   Is the server running on host 
> "ullmann.debian.org" (2607:f8f0:614:1::1274:38) and accepting
> [16 03:25:40 PM]   TCP/IP connections on port 5452?
> [16 03:25:40 PM]  could not connect to server: Connection refused
> [16 03:25:40 PM]   Is the server running on host 
> "ullmann.debian.org" (209.87.16.38) and accepting
> [16 03:25:40 PM]   TCP/IP connections on port 5452?
> [16 03:25:51 PM]  this used to work apparently :)
> [16 03:26:50 PM] <@jcristau> do you have any evidence for that claim?
> [16 03:27:42 PM]  well, it being mentioned in the host's 
> pg_service.conf should say that at least it ought to
> [16 03:28:25 PM]  plus yes, blends.d.n always used udd for things, 
> so I'm positive it used to work even without Andreas' claim on debian-qa@
> [16 03:29:10 PM]  
> https://lists.debian.org/debian-qa/2021/06/msg6.html
> [16 03:30:28 PM] <@jcristau> i can't find any trace of that having changed in 
> recent months/years
> [16 03:35:29 PM] <@adsb> even before it moved to managed hba the hard-coded 
> list doesn't appear to have contained dillon
> [16 03:35:55 PM] <@adsb> (hard-coded for ferm, that is)
> 
> So, could you maybe take it to them?  It's very likely ullmann's
> firewall not allowing dillon, which is their territory.
> 
> > > > $ LANG=C psql -U guest -h udd.debian.org -p 5452 udd
> > > 
> > > I recommend you use the "psql service=udd" syntax instead, so that you
> > > are more reliant in case something did change in the connection details.
> > 
> > Same result here:
> 
> Yeah, sure, I expect the same result.  I was just saying that it might
> nicer since it allows you to run the same line regardless of the host
> you are in (which might have different configuration needs). :)
> 
> -- 
> regards,
> Mattia Rizzolo
> 
> GPG Key: 66AE 2B4A FCCF 3F52 DA18  4D18 4B04 3FCD B944 4540  .''`.
> More about me:  https://mapreri.org : :'  :
> Launchpad user: https://launchpad.net/~mapreri  `. `'`
> Debian QA page: https://qa.debian.org/developer.php?login=mattia  `-



-- 
http://fam-tille.de



Re: Access to UDD seems to have changed since 2021-06-01

2021-06-17 Thread Andreas Tille
Hi Mattia,

On Wed, Jun 16, 2021 at 03:29:45PM +0200, Mattia Rizzolo wrote:
> > the Blends web sentinel (running in dillon.d.o) can't connect to UDD any
> > more since 2021-06-01.  I'm using a Python script for the connection but
> > even the simple access via psql (described in Wiki[1]) fails:
> 
> This is definitely not something within UDD people's scope, but I
> forwarded it to DSA on IRC, thanks for reporting.

Thanks.  Are you able to reproduce the issue in some way (from another
host, maybe)?
 
> > $ LANG=C psql -U guest -h udd.debian.org -p 5452 udd
> 
> I recommend you use the "psql service=udd" syntax instead, so that you
> are more reliant in case something did change in the connection details.

Same result here:

$ LANG=C psql service=udd
psql: could not connect to server: Connection refused
Is the server running on host "ullmann.debian.org" 
(2607:f8f0:614:1::1274:38) and accepting
TCP/IP connections on port 5452?
could not connect to server: Connection refused
Is the server running on host "ullmann.debian.org" (209.87.16.38) and 
accepting
TCP/IP connections on port 5452?

Kind regards

   Andreas.

-- 
http://fam-tille.de



Access to UDD seems to have changed since 2021-06-01

2021-06-16 Thread Andreas Tille
Hi,

the Blends web sentinel (running in dillon.d.o) can't connect to UDD any
more since 2021-06-01.  I'm using a Python script for the connection but
even the simple access via psql (described in Wiki[1]) fails:

$ LANG=C psql -U guest -h udd.debian.org -p 5452 udd
psql: could not connect to server: Connection refused
Is the server running on host "udd.debian.org" 
(2607:f8f0:614:1::1274:38) and accepting
TCP/IP connections on port 5452?
could not connect to server: Connection refused
Is the server running on host "udd.debian.org" (209.87.16.38) and 
accepting
TCP/IP connections on port 5452?

Did anything change at the beginning of June?  Is the Wiki outdated?

Kind regards

   Andreas.


[1] https://wiki.debian.org/UltimateDebianDatabase

-- 
http://fam-tille.de



Re: Debian Trends updated

2021-04-17 Thread Andreas Tille
Hi Lucas,

On Sat, Apr 17, 2021 at 09:36:10AM +0200, Lucas Nussbaum wrote:
> Trends is just based on what lintian reports, and in that case, lintian
> thinks that's the case, see https://lintian.debian.net/sources/probcons

Thanks for the clarification.
 
> It looks like this package ships both debian/copyright and
> debian/probcons.copyright. While debian/copyright is DEP5-compliant,
> debian/probcons.copyright isn't because of the first two lines:
> https://sources.debian.org/src/probcons/1.12-13/debian/probcons.copyright/

Argh, fixed in Git.

Good we talked about this

   Andreas.

-- 
http://fam-tille.de



Re: Debian Trends updated

2021-04-17 Thread Andreas Tille
Hi Lucas,

On Wed, Apr 07, 2021 at 02:03:47PM +0200, Lucas Nussbaum wrote:
> I just updated Debian Trends: https://trends.debian.net/

Thanks a lot for Debian Trends.  I have checked the code smells[1] and
I think this is a false positive:

probcons (U) does not use the machine-readable copyright format. 
(source version: 1.12-13)

since this version has a DEP5 copyright[2].  Am I missing something?

Thanks again for your work on this
   Andreas.

[1] https://trends.debian.net/packages-with-smells-sorted-by-maintainer.txt
[2] https://sources.debian.org/src/probcons/1.12-13/debian/copyright/

-- 
http://fam-tille.de



Bug#966649: Unfortunately there are several Uploads missing (Was: upload_history is back)

2020-08-26 Thread Andreas Tille
Control: reopen -1

Hi Asheesh,

On Tue, Aug 25, 2020 at 10:52:56PM -0700, Asheesh Laroia wrote:
> Test yourself with e.g. this command (which queries the public UDD mirror,
> but you can use the real UDD if you can connect to ullmann.debian.org)!

I tested my teammetrics statistics script[1] which created the data
files I attached in the tarball.  The text file is used as input for an
R script that produces the graphs, which are also inside the attached
tarball.

The tarball contains the results from the end of May (later the uploaders
table was broken) and from today.  Unfortunately it seems a lot of
entries are missing.  Another data point for missing uploads in the
table is that the bug statistics of the Debian Med team[2] look somehow
"unrelated" to the uploads.  That I fixed a total of 2057 bugs in only
1720 uploads does not sound sensible, since I did lots of uploads without
fixing bugs and only a few uploads fixed more than one bug.
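
Just to illustrate the kind of cross-check I mean - a sketch only, where
'someone@debian.org' is a placeholder address and I am assuming that
done_email in archived_bugs uses the same address format as
changed_by_email in upload_history:

SELECT
  (SELECT count(*) FROM upload_history WHERE changed_by_email = 'someone@debian.org') AS uploads,
  (SELECT count(*) FROM archived_bugs  WHERE done_email       = 'someone@debian.org') AS closed_bugs;

For my own address I would expect uploads to clearly exceed closed_bugs,
which is not what the current numbers suggest.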

So the bad news is that there is something wrong with the importer.

Thanks a lot for your work anyway

  Andreas.

[1] 
https://salsa.debian.org/teammetrics-team/teammetrics/-/blob/master/upload_history.py
[2] http://blends.debian.net/liststats/bugs_debian-med.png

-- 
http://fam-tille.de


uploaders_may-august.tar.xz
Description: application/xz


Bug#966649: Merge request for minimal, tested Python 3 port

2020-08-23 Thread Andreas Tille
Hi Asheesh,

thanks a lot.  I hope Lucas will merge and check.
I'm currently working on DebConf things.

Thanks again to you and Lucas

 Andreas.

On Sun, Aug 23, 2020 at 02:38:35AM -0700, Asheesh Laroia wrote:
> Hi all,
> 
> I submitted a merge request here with code that should work for a Python 3
> port: https://salsa.debian.org/qa/udd/-/merge_requests/26
> 
> It relies on the current approach: rsync'd historic mboxes & a
> .current mbox. It should operate identically to the bitrotted munge_ddc.py.
> Good advice, Lucas, on focusing on the existing code, and Andreas, thanks
> for helping me see it really is important to have this in UDD for your work.
> 
> Long-term I think that there are other changes I'd make, but I may as well
> focus on restoring service first, then other improvements second.
> 
> Happy for any feedback. Cheers.
> 
> Asheesh.

-- 
http://fam-tille.de



Bug#966649: Request for feedback on upload_history re-implementation

2020-08-22 Thread Andreas Tille
Hi Asheesh,

On Sat, Aug 22, 2020 at 11:21:55AM -0700, Asheesh Laroia wrote:
> You noticed that the date column was an integer. That's fixed now; if you
> update from git, and if you delete upload_history.sqlite on your machine,
> and re-run the tool, the upload_history column will use a datetime for the
> date column. It won't do many HTTP queries, so it's peaceful to do that.

Thanks a lot for fixing this.

> (In case you're curious, the integer was "epoch time", seconds since Jan 1
> 1970.)

I assumed this was the case. ;-)

> More info here:
> https://github.com/paulproteus/debian-devel-changes-history-extractor/issues/5
> 
> You noticed I'm storing message_id in the upload_history table. It's that I
> probably don't need that. I'll see if I can get rid of it. For now, my
> advice is to ignore it; I'll work on getting rid of it.

I'm fine with additional columns since these are not breaking existing
apps - provided these extra columns do not need a lot of disk space (as
in case of the full text of the changelog paragraph).
 
> You also noticed I'm storing the full changelog paragraph. I removed that a
> moment ago -- if you get a fresh copy from git, it should be gone. You're
> right that I was storing it for debugging reasons, and I don't need it in
> upload_history.

Great.  Thanks a lot.
 
> As for signed_by* -- working on it next.

Very cool.  I'll test soon.

The bad news is that I realised I'm using the uploaders table in way
more applications than I expected in the first place, and it is joined to
several other tables in UDD.  So my earlier statement that a simple sqlite
file is sufficient for the moment was a bit naive.  I have no idea
whether there is some sqlite-to-PostgreSQL converter - but it would be
really great to have upload_history back in UDD in the near future.  But
well, the world will keep on turning if you do not manage it before
DebConf ends (the talk I need the data for is on the last day
of DebConf).

Thanks a lot for your work in any case

  Andreas.

-- 
http://fam-tille.de



Bug#966649: Request for feedback on upload_history re-implementation

2020-08-20 Thread Andreas Tille
Hi Asheesh,

I'm currently testing your code from the git repository.  Interestingly
it even computes future months ;-) :

...
Computed upload history for 2020-05
Computed upload history for 2020-06
Computed upload history for 2020-07
Computed upload history for 2020-08
Computed upload history for 2020-09
Computed upload history for 2020-10
Computed upload history for 2020-11
Computed upload history for 2020-12
Computed upload history for 2021-01
Computed upload history for 2021-02
Computed upload history for 2021-03
Computed upload history for 2021-04
Computed upload history for 2021-05
Computed upload history for 2021-06
Computed upload history for 2021-07
Computed upload history for 2021-08
Computed upload history for 2021-09
Computed upload history for 2021-10
Computed upload history for 2021-11
Computed upload history for 2021-12


From the first look the result looks sensible:

sqlite> select * from upload_history where maintainer like 
'%debian-med-packaging%' limit 2 ;
e1jawxz-000605...@ries.debian.org|1199391582|gnumed-client|0.2.8.1-1|Andreas 
Tille |Andreas Tille|ti...@debian.org|Debian-Med Packaging 
Team |Debian-Med Packaging 
Team|debian-med-packag...@lists.alioth.debian.org|0|
 gnumed-client (0.2.8.1-1) unstable; urgency=low
 .
   * New upstream version
e1japsm-0006xr...@ries.debian.org|1199462003|probcons|1.12-4|Charles Plessy 
|Charles 
Plessy|charles-debian-nos...@plessy.org|Debian-Med Packaging Team 
|Debian-Med Packaging 
Team|debian-med-packag...@lists.alioth.debian.org|0|
 probcons (1.12-4) unstable; urgency=low
 .
 - Allowed upload by Debian Maintainers.
 - Checked the compliance with Policy 3.7.3
   * debian/patches:
 - swiched to quilt
 - added a fix to build with GCC 4.3 (Closes: #455625)
   * debian/rules:
 - modify Main-RNA.cc so that it uses Defaults-RNA.h (Closes: #458926)
   * debian/copyright:
 - converted to machine-readable format.
 .
   [ David Paleino ]
   * debian/probcons.1, debian/probcons-RNA.1, debian/pc-compare.1,
 debian/pc-makegnuplot.1, debian/pc-project.1 added - these
 have been statically built.
   * debian/control:
 - B-D updated
 - added myself to Uploaders
   * debian/rules:
 - manpages statically built
 - minor changes

But I guess you consider this table partly a debugging aid.  I do not
see a good reason to store the full changelog paragraph otherwise.  You
are also storing message_id.  That's OK from a data volume point of
view, but I do not see any real use for this field at the moment.

I would love to see the same table structure as in UDD:

   source | version | date | changed_by | changed_by_name | changed_by_email | 
maintainer | maintainer_name | maintainer_email | nmu | signed_by | 
signed_by_name | signed_by_email | key_id | distribution | file | fingerprint

What I'm missing is signed_by*.  No idea what key_id means - I never used
it.  Distribution might be good to have as well; no idea what file
might have contained.  Fingerprint also seems sensible since it could be
a link to the carnivore table.
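
The kind of join I have in mind for that link would be something like
this (a sketch only; I am assuming a carnivore_keys table with (id, key)
columns analogous to carnivore_emails and carnivore_names):

SELECT uh.source, uh.version, ck.id AS carnivore_id
  FROM upload_history uh
  JOIN carnivore_keys ck ON ck.key = uh.fingerprint;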


Regarding the decision to parse the web archives rather than mboxes: I
don't know what is better.  I agree that accessing public data is an
advantage but if it is at the expense of more complex code I would
rather stick to the mbox parsing.

BTW, formerly the data went at least back to 2000.  Here is the graph
for pkg-perl:

   http://blends.debian.net/liststats/uploaders_pkg-perl.png

Currently you encode date as an integer in sqlite, so I need to think about
how to translate this.  For the target query I want to run for my talk it
would be more convenient to have date or datetime values.
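
Converting on the fly would work as well - a minimal sketch in the sqlite
shell, assuming the column stores Unix epoch seconds:

sqlite> SELECT source, version, datetime(date, 'unixepoch') AS date
   ...>   FROM upload_history LIMIT 3;

(to_timestamp(date) would be the PostgreSQL equivalent once the data lands
in UDD) - but having real timestamp values in the table would still be nicer.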

So far for my review.

Thanks a lot for your work on this.  It's really appreciated!

Kind regards

  Andreas.

On Wed, Aug 19, 2020 at 11:03:40PM -0700, Asheesh Laroia wrote:
> Hi Andreas & Lucas & all,
> 
> Lucas -- I'm making progress on re-implementing this. I'd love your input
> by email or IRC about my approach, but if you're busy, feel free to ignore
> this and I'll mention you again when I submit a patch.
> 
> Andreas -- The codebase at
> https://github.com/paulproteus/debian-devel-changes-history-extractor can
> be run on your system and generate a "upload_history" table. Would you be
> willing to try it out and let me know if it meets your needs?
> 
> The README at the URL above has some information about how to use it.
> 
> https://drive.google.com/drive/folders/1hF_zuc_03m3a_VwOO5hpjp5vETNjVxMx?usp=sharing
> is a Google Drive folder (owned by me) which contains an
> upload_history.sqlite file you can use. This would allow you to query the
> current database without using the code to create it. (Feel free to also
> use the code to create your own DB.)
> 
> I'm happy to discuss by IRC or private email or BTS email what you would
> need next. I do hope to resolve the issues listed in the bug tracker on
> GitHub, but

Re: [UDD] Upload_history table is currently empty

2020-08-14 Thread Andreas Tille
Hi Lucas,

On Thu, Aug 13, 2020 at 11:24:42PM +0200, Lucas Nussbaum wrote:
> 
> Well, why don't you look at the code?

No idea whether it is sensible to answer rhetorical questions.
Since you asked - despite probably knowing that I'm doing even more than
usual for Debian Med since COVID-19 - here are some other reasons besides
the usual ENOTIME excuse:

 - The development of that code is not transparent to me
 - Someone - whoever it was - just dropped
   /srv/udd.debian.org/upload-history/munge_ddc.py.tentative
   on 2020-08-01, which seems to be a Python3 port of munge_ddc.py
 - I had simply hoped that your bug report might have triggered
   some action by the maintainer of that code (whoever this might be)
 - I was afraid to do some naive poking in code I would first need to
   understand (while other, more competent developers are just too
   busy to fix things).

In other words: it's not really an inviting environment for potential
helpers.

To at least do something about the latter I injected everything I
categorized as "code or data that comes from some source I have no idea
where to obtain" into Git:


https://salsa.debian.org/qa/udd/-/commit/a0408b9d03a9e26a775a9525760a02199c306f5d

I repeat that I consider it sensible to develop here in Git and work
with symlinks to /srv/udd.debian.org/upload-history/ - but even this has
not yet been confirmed by those people who initially developed the code
(nor by those who are working on munge_ddc.py.tentative).

IMHO the next logical step would be to replace the now unusable
munge_ddc.py by munge_ddc.py.tentative, set the symlinks and simply
call the procedure to see what happens.  Do you want me to do this,
or does anybody have a better idea how to proceed (which I would
be very happy about)?

Kind regards

Andreas.

-- 
http://fam-tille.de



Re: [UDD] Upload_history table is currently empty

2020-08-12 Thread Andreas Tille
Hi again,

On Mon, Aug 03, 2020 at 11:20:21AM +0200, Andreas Tille wrote:
> > > 'munge_ddc.py' has the following issues:
> > > [...]
> > > - it doesn't support xz email archives, so it's broken for recent
> > >   archives
> > 
> > It used to work some months ago because it was relying on a huge
> > debian-devel-changes.current. But ullmann ran out of disk space due to
> > this.
> 
> Argh, to bad that disk space is an issue these days.
>  
> > > Do we have a plan to fix this?  I really need those Uploaders data to 
> > > prepare
> > > my DebConf20 talk.
> > 
> > Given your ongoing effort to port UDD to Python3, I think that the best
> > plan is to do that, and port munge_dcc.py to Python3.
> 
> I'd do some 2to3 and simply start it - but its hard to do this on a
> local box here since it seems to rely on data that are stored on
> ullmann.  I also need to admit that I'm currently not able to spent lots
> of time into it. 

Do you see any way I can help speed up solving this issue?
I have no idea about the actual code - not even who wrote it, since
it's not in Git (couldn't we at least move a copy into the UDD Git
repository and possibly symlink to it on ullmann to have some version
control?) - and no idea how to test it locally without breaking anything
on ullmann.

Kind regards

  Andreas. 

-- 
http://fam-tille.de



Bug#957717: [Help] pvm: ftbfs with GCC-10

2020-08-12 Thread Andreas Tille
Hi,

While I do not intend to maintain pvm personally, some Debian Med packages
depend on it.  Thus I would like to see bug #957717 fixed, but I need help.
I committed some general packaging changes, so you can find the latest
packaging state in Git[1].  When building this I get the following
output:

cc -DSYSVSIGNAL -DNOWAIT3 -DRSHCOMMAND=\"/usr/bin/rsh\" -DNEEDENDIAN 
-DFDSETNOTSTRUCT -DHASERRORVARS -DHASSTDLIB -DCTIMEISTIMET -DSYSERRISCONST 
-DNOTMPNAM -DSYSVSTR -DUSESTRERROR  -g -O2 
-fdebug-prefix-map=/build/pvm-3.4.6=. -fstack-protector-strong -Wformat 
-Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 
-DRSHCOMMAND="/usr/lib/pvm3/bin/rsh" -DPVMDPATH="pvmd" 
-DPVMDFILE="/usr/bin/pvmd" -DPVM_DEFAULT_ROOT="/usr/lib/pvm3" -DOVERLOADHOST 
-Wl,-z,relro -Wl,-z,now -fPIC -DCLUMP_ALLOC -DSTATISTICS -DTIMESTAMPLOG 
-DSANITY -I/build/pvm-3.4.6/include -DARCHCLASS=\"LINUX64\" -DIMA_LINUX64 -c 
/build/pvm-3.4.6/src/ddpro.c
: warning: "RSHCOMMAND" redefined
: note: this is the location of the previous definition
/build/pvm-3.4.6/src/ddpro.c: In function 'hostfailentry':
/build/pvm-3.4.6/src/ddpro.c:556:3: warning: implicit declaration of function 
'pvmlogprintf' [-Wimplicit-function-declaration]
  556 |   pvmlogprintf("hostfailentry() host %s\n", hp->hd_name);
  |   ^~~~
/build/pvm-3.4.6/src/ddpro.c:561:3: warning: implicit declaration of function 
'pvmlogerror'; did you mean 'pvm_perror'? [-Wimplicit-function-declaration]
  561 |   pvmlogerror("hostfailentry() lost master host, we're screwwwed\n");
  |   ^~~
  |   pvm_perror
/build/pvm-3.4.6/src/ddpro.c:575:3: warning: implicit declaration of function 
'pkint'; did you mean 'printf'? [-Wimplicit-function-declaration]
  575 |   pkint(mp, hosts->ht_serial);
  |   ^
  |   printf
/build/pvm-3.4.6/src/ddpro.c:582:5: warning: implicit declaration of function 
'sendmessage'; did you mean 'sendmsg'? [-Wimplicit-function-declaration]
  582 | sendmessage(mp);
  | ^~~
  | sendmsg
/build/pvm-3.4.6/src/ddpro.c:656:7: warning: implicit declaration of function 
'assign_tasks' [-Wimplicit-function-declaration]
  656 |   assign_tasks(wp);
  |   ^~~~
/build/pvm-3.4.6/src/ddpro.c:682:6: warning: implicit declaration of function 
'free_waitc_add' [-Wimplicit-function-declaration]
  682 |  free_waitc_add((struct waitc_add *)wp->wa_spec);
  |  ^~
/build/pvm-3.4.6/src/ddpro.c:695:5: warning: implicit declaration of function 
'mb_tidy' [-Wimplicit-function-declaration]
  695 | mb_tidy(wp->wa_on);
  | ^~~
/build/pvm-3.4.6/src/ddpro.c:703:5: warning: implicit declaration of function 
'mb_tidy_reset' [-Wimplicit-function-declaration]
  703 | mb_tidy_reset(wp->wa_on);
  | ^
/build/pvm-3.4.6/src/ddpro.c: At top level:
/build/pvm-3.4.6/src/ddpro.c:821:1: warning: return type defaults to 'int' 
[-Wimplicit-int]
  821 | free_waitc_add(wxp)
  | ^~
/build/pvm-3.4.6/src/ddpro.c: In function 'addhosts':
/build/pvm-3.4.6/src/ddpro.c:882:6: warning: implicit declaration of function 
'upkint' [-Wimplicit-function-declaration]
  882 |  if (upkint(mp, ) || count < 1 || count > maxhostid) {
  |  ^~
/build/pvm-3.4.6/src/ddpro.c:903:7: warning: implicit declaration of function 
'upkstralloc' [-Wimplicit-function-declaration]
  903 |   if (upkstralloc(mp, )) {
  |   ^~~
/build/pvm-3.4.6/src/ddpro.c:907:7: warning: implicit declaration of function 
'parsehost' [-Wimplicit-function-declaration]
  907 |   if (parsehost(buf, hp)) {
  |   ^
/build/pvm-3.4.6/src/ddpro.c:917:5: warning: implicit declaration of function 
'applydefaults' [-Wimplicit-function-declaration]
  917 | applydefaults(hp, hp2);
  | ^
: error: 'pvmd' undeclared (first use in this function)
/build/pvm-3.4.6/src/ddpro.c:1031:14: note: in expansion of macro 'PVMDPATH'
 1031 |   pvmdpath = PVMDPATH;
  |  ^~~~
: note: each undeclared identifier is reported only once for each 
function it appears in
/build/pvm-3.4.6/src/ddpro.c:1031:14: note: in expansion of macro 'PVMDPATH'
 1031 |   pvmdpath = PVMDPATH;
  |  ^~~~
/build/pvm-3.4.6/src/ddpro.c:1039:3: warning: implicit declaration of function 
'pkstr' [-Wimplicit-function-declaration]
 1039 |   pkstr(mp2, hp->hd_sopts ? hp->hd_sopts : "");
  |   ^
/build/pvm-3.4.6/src/ddpro.c:1133:5: warning: implicit declaration of function 
'pvmlogperror'; did you mean 'pvm_perror'? [-Wimplicit-function-declaration]
 1133 | pvmlogperror("addhosts() fork");
  | ^~~~
  | pvm_perror
/build/pvm-3.4.6/src/ddpro.c:1142:4: warning: implicit declaration of function 
'beprime' [-Wimplicit-function-declaration]
 1142 |beprime();
  |^~~
/build/pvm-3.4.6/src/ddpro.c:1144:4: warning: implicit declaration of function 
'hoster' [-Wimplicit-function-declaration]
 1144 |hoster(mp2);
  |^~

Re: [UDD] Upload_history table is currently empty

2020-08-03 Thread Andreas Tille
Hi Lucas,

On Mon, Aug 03, 2020 at 11:05:12AM +0200, Lucas Nussbaum wrote:
> > So why does this end in 2013?  Funnily enough when I rsync to my local box I
> > get random zero sized *.xz.out files (but lots of missings - for instance 
> > only
> > debian-devel-changes.201907.xz.out for whole 2019).  I remember that the 
> > Uploaders
> > table made sense some monthes ago so that sounds pretty strange. 
> 
> Yes, as I wrote in the bug report:

Ahhh, maybe I'm a bit slow in understanding the bug report, sorry
about this.
 
> > 'munge_ddc.py' has the following issues:
> > [...]
> > - it doesn't support xz email archives, so it's broken for recent
> >   archives
> 
> It used to work some months ago because it was relying on a huge
> debian-devel-changes.current. But ullmann ran out of disk space due to
> this.

Argh, too bad that disk space is an issue these days.
 
> > Do we have a plan to fix this?  I really need those Uploaders data to 
> > prepare
> > my DebConf20 talk.
> 
> Given your ongoing effort to port UDD to Python3, I think that the best
> plan is to do that, and port munge_dcc.py to Python3.

I'd do some 2to3 and simply start it - but it's hard to do this on a
local box here since it seems to rely on data that are stored on
ullmann.  I also need to admit that I'm currently not able to spend a
lot of time on it.

Kind regards

   Andreas.

-- 
http://fam-tille.de



Re: [UDD] Upload_history table is currently empty

2020-08-03 Thread Andreas Tille
On Mon, Aug 03, 2020 at 08:18:44AM +0200, Lucas Nussbaum wrote:
> > > >https://salsa.debian.org/qa/udd/-/tree/python3
> > > 
> > > ... but it does not include the script in question for this bug?
> > 
> > You mean munge_ddc.py?  Hmmm, I can not even find it in master branch?
> 
> As I wrote in the bug:
> > 'munge_ddc.py' has the following issues:
> > - it's not version-controlled

Well, committing it to Git somehow would be cheap, but

   ullmann:/srv/udd.debian.org/tmp/upload-history

seems to have other issues:

ullmann:/srv/udd.debian.org/tmp/upload-history$ ls -l *.out | tail
-rw-rw-r-- 1 udd   uddadm  688973 Aug  2  2012 
debian-devel-changes.201207.gz.out
-rw-rw-r-- 1 udd   uddadm  654586 Sep  2  2012 
debian-devel-changes.201208.gz.out
-rw-rw-r-- 1 udd   uddadm  732282 Okt  2  2012 
debian-devel-changes.201209.gz.out
-rw-rw-r-- 1 udd   uddadm  965837 Nov  2  2012 
debian-devel-changes.201210.gz.out
-rw-rw-r-- 1 udd   uddadm  816334 Dez  2  2012 
debian-devel-changes.201211.gz.out
-rw-rw-r-- 1 udd   uddadm  800340 Jan  2  2013 
debian-devel-changes.201212.gz.out
-rw-r--r-- 1 lucas Debian  800912 Mär 22  2013 
debian-devel-changes.201301.gz.out
-rw-r--r-- 1 lucas Debian  712706 Mär 22  2013 debian-devel-changes.201301.out
-rw-r--r-- 1 lucas Debian  714748 Mär 22  2013 
debian-devel-changes.201302.gz.out
-rw-r--r-- 1 lucas Debian  573184 Mär 22  2013 debian-devel-changes.201303.out

So why does this end in 2013?  Funnily enough, when I rsync to my local box I
get random zero-sized *.xz.out files (and lots are missing - for instance only
debian-devel-changes.201907.xz.out for the whole of 2019).  I remember that
the Uploaders table made sense some months ago, so that sounds pretty strange.

Do we have a plan to fix this?  I really need those Uploaders data to prepare
my DebConf20 talk.

Kind regards

Andreas.

-- 
http://fam-tille.de



Re: [UDD] Upload_history table is currently empty

2020-08-02 Thread Andreas Tille
On Sun, Aug 02, 2020 at 04:09:49PM +0200, Lucas Nussbaum wrote:
> On 02/08/20 at 15:12 +0200, Andreas Tille wrote:
> > Hi,
> > 
> > an untested 2to3 port of UDD code is in python3 branch of
> > 
> >https://salsa.debian.org/qa/udd/-/tree/python3
> 
> ... but it does not include the script in question for this bug?

You mean munge_ddc.py?  Hmmm, I cannot even find it in the master branch.
I once branched master for a 2to3 run and I'm merging from time to time.
As I said, most things (actually those I did not write on my own) are
untested.

Kind regards

   Andreas.

-- 
http://fam-tille.de



Re: [UDD] Upload_history table is currently empty

2020-08-02 Thread Andreas Tille
Hi,

an untested 2to3 port of UDD code is in python3 branch of

   https://salsa.debian.org/qa/udd/-/tree/python3

I consider it sensible to switch to this soon and fix the bugs: we need
to switch anyway sooner or later, and since we now have a critical bug
that affects feeding a table, it should be the right time.

Kind regards

 Andreas.

On Sun, Aug 02, 2020 at 10:47:04AM +0200, Lucas Nussbaum wrote:
> Hi,
> 
> https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=966649
> 
> On 02/08/20 at 10:11 +0200, Andreas Tille wrote:
> > Hi Lucas,
> > 
> > I'm just realising:
> > 
> > udd=> select count(*) from upload_history;
> >  count 
> > ---
> >  0
> > 
> > I admit I have no idea what might be wrong here.
> > 
> > Kind regards
> > 
> >Andreas.
> > 
> > -- 
> > http://fam-tille.de
> > 
> 

-- 
http://fam-tille.de



Bug#963903: 'ftpnew-blends' importer crashing. disabled.

2020-06-29 Thread Andreas Tille
Control: tags -1 unreproducible

Hi Lucas,

On Sun, Jun 28, 2020 at 06:01:33PM +0200, Lucas Nussbaum wrote:
> Traceback (most recent call last):
>   File "/srv/udd.debian.org/udd//udd.py", line 88, in 
> exec "gatherer.%s()" % command
>   File "", line 1, in 
>   File "/srv/udd.debian.org/udd/udd/blends_prospective_gatherer.py", line 
> 407, in run
> upstream = upstream_reader(ufile, source, self.log, sprosp['blend'])
>   File "/srv/udd.debian.org/udd/udd/upstream_reader.py", line 126, in __init__
> self.fields = yaml.safe_load(uf.read())
>   File "/usr/lib/python2.7/dist-packages/yaml/__init__.py", line 93, in 
> safe_load
> return load(stream, SafeLoader)
>   File "/usr/lib/python2.7/dist-packages/yaml/__init__.py", line 71, in load
> return loader.get_single_data()
>   File "/usr/lib/python2.7/dist-packages/yaml/constructor.py", line 37, in 
> get_single_data
> node = self.get_single_node()
>   File "/usr/lib/python2.7/dist-packages/yaml/composer.py", line 43, in 
> get_single_node
> event.start_mark)
> yaml.composer.ComposerError: expected a single document in the stream
>   in "", line 1, column 1:
> Reference:
> ^
> but found another document
>   in "", line 21, column 1:
> ---
> ^

I've tested the importer and cannot reproduce this.  Maybe there is a
broken YAML file included in some just-accepted package.  The importer
catches several of those errors and usually runs smoothly.

Am I understanding you correctly that you deactivated the ftpnew gatherer?
If yes, I think it can be activated again and I will try to keep a close
eye on it for the next couple of days.  In case you notice such an issue
in the future it could be a good idea to keep a copy of
/srv/udd.debian.org/mirrors/ftpnew
somewhere, so we have the actual data that caused the crash.

Kind regards and thanks a lot for caring for UDD

 Andreas.

-- 
http://fam-tille.de



Issues with UDD importer [Was: Cron /srv/udd.debian.org/udd/rudd --status]

2020-06-25 Thread Andreas Tille
Hi,

I've noticed missing entries in UDD: nanofilt was recently accepted into
Debian but:

select * from sources where source='nanofilt';
select * from packages where package='nanofilt';

both have no result.

This seems to correlate with the UDD status errors.

Kind regards

  Andreas.

- Forwarded message from Cron Daemon  -

Date: Wed, 24 Jun 2020 12:00:01 +
From: Cron Daemon 
To: lu...@debian.org, ti...@debian.org
Subject: Cron  /srv/udd.debian.org/udd/rudd --status

/usr/lib/ruby/vendor_ruby/json/common.rb:156:in `parse': 740: unexpected token 
at '{ (JSON::ParserError)
  "importers": {
"security-tracker": { "cron":"0 * * * *", "pool":"long-tasks" },
"ci": { "cron":"1 * * * *" },
"migration-excuses": { "cron":"50 * * * *" },
"duck": { "cron":"3 * * * *" },
"wanna-build": {
  "cron":"9 * * * *",
  "pool":"wannabuild"
},
"aptosid": {
  "cron":"30 3 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh aptosid",
},
"debian-popcon": {
  "cron":"42 1,13 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh debian-popcon"
},
"ubuntu-popcon": {
  "cron":"42 1,13 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh ubuntu-popcon"
},
"lintian": {
  "cron":"42 6,18 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh lintian"
},
"piuparts": {
  "cron":"42 6,18 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh piuparts"
},
"ldap": {
  "cron":"42 9,21 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh ldap"
},
"removals": {
  "cron":"42 9,21 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh removals"
},
"carnivore": {
  "cron":"42 9,21 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh carnivore"
},
"ftp-autorejects": {
  "cron":"42 9,21 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh ftp-autorejects"
},
"pseudo-packages": {
  "cron":"42 9,21 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh pseudo-packages"
},
"debian_maintainers": {
  "cron":"42 9,21 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh debian_maintainers"
},
"vcswatch": {
  "cron":"22,52 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh vcswatch"
},
"reproducible": {
  "cron":"42 2 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh reproducible"
},
"key-packages": {
  "cron":"5 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh key-packages"
},
"mentors": {
  "cron":"45 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh mentors"
},
"testing-migrations": {
  "cron":"6 0,12 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh testing-migrations"
},
"testing-autoremovals": {
  "cron":"42 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh testing-autoremovals"
},
"deferred": {
  "cron":"8 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh deferred"
},
"hints": {
  "cron":"35 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh hints"
},
"orphaned-packages": {
  "cron":"12,27,42,57 * * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh orphaned-packages"
},
"screenshots": {
  "cron":"30 3 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh screenshots"
},
"ftpnew-blends": {
  "cron":"40 3 * * *",
  "command":"/srv/udd.debian.org/udd/scripts/cron_ftpnew_blends.sh",
  "pool":"long-tasks"
},
"ddtp": {
  "cron":"30 2 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh ddtp",
  "pool":"long-tasks"
},
"history-daily": {
  "cron":"0 0 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh history-daily"
},
"upload-history": {
  "cron":"49 0,6,12,18 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh upload-history",
  "pool":"long-tasks"
},
"ubuntu-upload-history": {
  "cron":"4 0,6,12,18 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh 
ubuntu-upload-history",
  "pool":"ubuntu"
},
"ubuntu-bugs": {
  "cron":"45 20 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh ubuntu-bugs",
  "pool":"ubuntu"
},
"archive-ubuntu": {
  "cron":"30 4,16 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh archive-ubuntu",
  "pool":"ubuntu"
},
"archive-debian":{
  "cron":"0 0 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh archive-debian",
  "pool":"none"
},
"archive-debian-security":{
  "cron":"0 0 * * *",
  "command":"/srv/udd.debian.org/udd/update-and-run.sh 
archive-debian-security",
  "pool":"none"
},

Re: Started porting UDD to Python3 (Was: [UDD] Is there some effort to port UDD to Python3?)

2020-05-18 Thread Andreas Tille
Hi Stéphane,

thanks for your patch which I applied in the python3 branch.  Unfortunately
it does not solve the issue:


udd(python3) $ ./update-and-run.sh ddtp
Traceback (most recent call last):
  File "/srv/udd.debian.org/udd//udd.py", line 88, in 
exec("gatherer.%s()" % command)
  File "", line 1, in 
  File "/srv/udd.debian.org/udd/udd/ddtp_gatherer.py", line 127, in run
h.update(f.read())
  File "/usr/lib/python3.8/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc5 in position 11: 
invalid continuation byte


Thanks a lot anyway

  Andreas.

On Mon, May 18, 2020 at 01:15:11PM +0200, Stéphane Blondon wrote:
> Hello,
> 
> On 15/05/2020 21:10, Andreas Tille wrote:
> > Would you mind providing a patch with chardet?
> There is a patch attached to this e-mail.
> 
> I used [1] for the base file. I don't think the patch is great (because
> there are two 'open()' calls) but it has minimal modifications of the
> current source code. I think it's a better solution for the success the
> migration to python3 (because it avoid introducing bugs during the
> migration).
> 
> 
> Feel free to ask for more explanations or other stuff if you need.
> 
> 1: https://salsa.debian.org/qa/udd/-/blob/master/udd/ddtp_gatherer.py
> 
> -- 
> Stéphane

> --- ddtp_gatherer.py.orig 2020-05-17 22:54:21.793075000 +0200
> +++ ddtp_gatherer.py  2020-05-18 13:02:47.210764004 +0200
> @@ -25,6 +25,8 @@
>  import logging
>  import logging.handlers
>  
> +import chardet
> +
>  debug=0
>  
>  def get_gatherer(connection, config, source):
> @@ -117,7 +119,7 @@
>trfile = trfilepath + file
># check whether hash recorded in index file fits real file
>try:
> -f = open(trfile)
> +f = _open_file(trfile)
>except IOError, err:
>  self.log.error("%s: %s.", str(err), trfile)
>  continue
> @@ -236,6 +238,13 @@
>  except IOError, err:
>self.log.exception("Error reading %s%s", dir, filename)
>  
> +def _open_file(path):
> +with open(path, 'rb') as f:
> +raw_content = f.read()
> +encoding = chardet.detect(raw_content)["encoding"]
> +return open(path, encoding=encoding)
> +
> +
>  if __name__ == '__main__':
>main()
>  





-- 
http://fam-tille.de



Re: Started porting UDD to Python3 (Was: [UDD] Is there some effort to port UDD to Python3?)

2020-05-15 Thread Andreas Tille
On Fri, May 15, 2020 at 08:51:05PM +0200, Stéphane Blondon wrote:
> > And, ideally, somebody would contact whoever is providing that file so that
> > they re-encode it with utf8...
> 
> Yes, it's the best long term solution.

Definitely.  But who is providing that file?
 
> >> `f = open(trfile, encoding='latin-1')`
> >>
> >> could be a (temporary?) solution.
> 
> Andreas, it's possible that changing the encoding will fix the bug for
> some files but you will get new errors on other files (encoded in
> utf-8). Trying several encoding or using 'chardet' library could be a
> better workaround.

Would you mind providing a patch with chardet?

Kind regards

  Andreas.

-- 
http://fam-tille.de



Started porting UDD to Python3 (Was: [UDD] Is there some effort to port UDD to Python3?)

2020-05-14 Thread Andreas Tille
On Wed, May 13, 2020 at 07:40:56PM +0200, Andreas Tille wrote:
> > 
> > Use the Vagrant development environment?
> 
> I admit I've never worked with this.  You said its pretty simple and I
> would guess a python3 branch where everybody commits the code he feels
> responsible for and test it would be sufficient.  But I'm fine to adapt
> (if you point me to some doc).

I've just followed my proposal to create a python3 branch, fired up 2to3
and fixed some issues manually.  My gatherers blends-prospective, ftpnew
and screenshots should work.  I'm now stumbling upon:

udd(python3) $ ./only-run.sh ddtp
Traceback (most recent call last):
  File "/srv/udd.debian.org/udd//udd.py", line 88, in 
exec("gatherer.%s()" % command)
  File "", line 1, in 
  File "/srv/udd.debian.org/udd/udd/ddtp_gatherer.py", line 125, in run
h.update(f.read())
  File "/usr/lib/python3.8/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc5 in position 11: 
invalid continuation byte


I tried to fix this using this patch:


$ git diff
diff --git a/udd/ddtp_gatherer.py b/udd/ddtp_gatherer.py
index 46e588d..7cc625c 100644
--- a/udd/ddtp_gatherer.py
+++ b/udd/ddtp_gatherer.py
@@ -117,7 +117,7 @@ class ddtp_gatherer(gatherer):
   trfile = trfilepath + file
   # check whether hash recorded in index file fits real file
   try:
-f = open(trfile)
+f = open(trfile, encoding='utf-8')
   except IOError as err:
 self.log.error("%s: %s.", str(err), trfile)
 continue


but it does not help.  Any hint would be welcome.

Kind regards

   Andreas. 

-- 
http://fam-tille.de



Re: [UDD] Is there some effort to port UDD to Python3?

2020-05-13 Thread Andreas Tille
On Wed, May 13, 2020 at 05:09:02PM +0200, Lucas Nussbaum wrote:
> On 13/05/20 at 16:38 +0200, Andreas Tille wrote:
> > Hi Lucas,
> > 
> > On Tue, Apr 14, 2020 at 08:47:11AM +0200, Andreas Tille wrote:
> > > > 
> > > > Not as far as I know. I suspect that, once it becomes necessary, it will
> > > > be easy to do given the codebase is relatively small.
> > > 
> > > I agree that the small code base makes it probably easy.  But I'm
> > > worried about the "once it becomes necessary" part.  We all know that
> > > Python2 is only alive due to our security team and we should actively
> > > work on getting rid of the dependency rather sooner than later.  Working
> > > "under pressure" makes things always uneasy - no matter how easy it
> > > would be in principle.
> > > 
> > > I know probably nobody will stop me from doing it - but I'm hesitating
> > > adding another item on my table which is full of Debian Med - Covid-19
> > > stuff.  I'd volunteer to port those importers I've written myself once
> > > somebody gives the signal - but I'd love if those who have written the
> > > core parts would take the lead (rather sooner than later).
> > 
> > I need to come back to this topic since I like to test the importers on
> > my local machines which are usually running testing.
> 
> Use the Vagrant development environment?

I admit I've never worked with this.  You said it's pretty simple, and I
would guess a python3 branch where everybody commits the code they feel
responsible for and tests it would be sufficient.  But I'm fine with
adapting (if you point me to some doc).

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there some effort to port UDD to Python3?

2020-05-13 Thread Andreas Tille
Hi Lucas,

On Tue, Apr 14, 2020 at 08:47:11AM +0200, Andreas Tille wrote:
> > 
> > Not as far as I know. I suspect that, once it becomes necessary, it will
> > be easy to do given the codebase is relatively small.
> 
> I agree that the small code base makes it probably easy.  But I'm
> worried about the "once it becomes necessary" part.  We all know that
> Python2 is only alive due to our security team and we should actively
> work on getting rid of the dependency rather sooner than later.  Working
> "under pressure" makes things always uneasy - no matter how easy it
> would be in principle.
> 
> I know probably nobody will stop me from doing it - but I'm hesitating
> adding another item on my table which is full of Debian Med - Covid-19
> stuff.  I'd volunteer to port those importers I've written myself once
> somebody gives the signal - but I'd love if those who have written the
> core parts would take the lead (rather sooner than later).

I need to come back to this topic since I would like to test the importers
on my local machines, which are usually running testing.  I now get a
conflict: python-debian is needed but cannot be installed any more, since
it would need python-chardet, which in turn conflicts with the latest
python3-chardet.  So simply picking from snapshot.d.o is no option and I
think it's time to do the Python3 port.  There are code contributions
from:

$ git log --pretty=format:"%an <%ae>" udd/*.py | sed 
's/@3b15d4d3-bb24-0410-9696-dc0fab150647/@debian.org/' | sort | uniq | grep -v 
-e 'Akshita Jha' -e 'Emmanouil Kiagias' -e ^lucas -e ^tille
Andreas Tille 
Bas Couwenberg 
Gianfranco Costamagna 
Ivo De Decker 
kroeckx 
laney 
Lucas Nussbaum 
Mattia Rizzolo 
Ole Streicher 
Paul Wise 
themill-guest 
zack 

(I left out former GSoC students of mine where I can take over the code
as well as duplicates that are obvious to me.)

So how can we organise the Python3 port of the UDD code base?

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-05-02 Thread Andreas Tille
Hi Lucas,

please forget my question - I simply missed the '--run' option.

Sorry for the noise, Andreas.

On Sat, May 02, 2020 at 09:37:43AM +0200, Andreas Tille wrote:
> Hi Lucas,
> 
> On Fri, May 01, 2020 at 10:57:39PM +0200, Lucas Nussbaum wrote:
> > I fixed it and added the same suites as you were, reverted your changes
> > and dropped the autopkgtest table. So everything should be fine.
> 
> Thanks for this.
>  
> > Sorry for not following closely enough when you started to work on that.
> 
> No problem - I've learned something anyway. ;-)
> 
> BTW, if I just want to update my local mirror, how can I do this:
> 
> $ /srv/udd.debian.org/udd/rudd ci
> Traceback (most recent call last):
> /srv/udd.debian.org/udd/rudd:62:in `': No option specified 
> (RuntimeError)
> 
> Kind regards
> 
>   Andreas.
> 
> -- 
> http://fam-tille.de
> 
> 

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-05-02 Thread Andreas Tille
Hi Lucas,

On Fri, May 01, 2020 at 10:57:39PM +0200, Lucas Nussbaum wrote:
> I fixed it and added the same suites as you were, reverted your changes
> and dropped the autopkgtest table. So everything should be fine.

Thanks for this.
 
> Sorry for not following closely enough when you started to work on that.

No problem - I've learned something anyway. ;-)

BTW, if I just want to update my local mirror, how can I do this:

$ /srv/udd.debian.org/udd/rudd ci
Traceback (most recent call last):
/srv/udd.debian.org/udd/rudd:62:in `': No option specified (RuntimeError)

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-28 Thread Andreas Tille
Hi again Lucas,

Maybe my mail slipped through, but basically the difference between the two
importers is that you are just parsing

https://ci.debian.net/data/status/unstable/amd64/packages.json

while the Python script is parsing

https://ci.debian.net/data/status/*/*/packages.json

You even have this in line 5:

# FIXME there might be more suites at some point

so you were aware of that issue.  Would you mind solving that FIXME?  Sorry,
I do not speak Ruby.

Kind regards

 Andreas.

On Tue, Apr 14, 2020 at 06:12:39AM +0200, Andreas Tille wrote:
> Hi Lucas,
> 
> On Mon, Apr 13, 2020 at 10:37:57PM +0200, Lucas Nussbaum wrote:
> > I'm sorry if I haven't paid enough attention. But what is the difference
> > with the 'ci' importer?
> > 
> > https://salsa.debian.org/qa/udd/-/blob/master/rimporters/ci.rb
> 
> I think the problem is that UDD is not documented and I simply did not
> know about the ci table. :-(
> 
> However, when looking at it there is a difference:
> 
> udd=# select status, arch, count(*) from ci group by status, arch order by 
> status, arch;
>  status  | arch  | count 
> -+---+---
>  fail| amd64 |   925
>  neutral | amd64 |  1593
>  pass| amd64 | 10805
>  tmpfail | amd64 |14
> (4 Zeilen)
> 
> 
> udd=# select status, architecture, count(*) from autopkgtest group by status, 
> architecture order by status, architecture;
>  status  | architecture | count 
> -+--+---
>  fail| amd64|  1561
>  fail| arm64|  1471
>  fail| ppc64el  |   711
>  neutral | amd64|  2879
>  neutral | arm64|  1322
>  neutral | ppc64el  |   298
>  pass| amd64| 21373
>  pass| arm64| 4
>  pass| ppc64el  |  2458
>  tmpfail | amd64|11
>  tmpfail | arm64|85
>  tmpfail | ppc64el  | 7
> (12 Zeilen)
> 
> I guess we should merge both and make sure that all data is imported.
> I would never have written a new importer if I would have been aware
> of the existing one - but I do not speak Ruby to fix the existing one.
> 
> Kind regards
>   Andreas.
> 
> > On 11/04/20 at 07:12 +0200, Andreas Tille wrote:
> > > Hi Paul,
> > > 
> > > thanks for the clarification.  This commit
> > > 
> > >
> > > https://salsa.debian.org/qa/udd/-/commit/6a874a89365671dd37a14a9bca25290dc55a1fc9
> > > 
> > > imports the current data.  I will tests this a bit more and than activate
> > > in a cron job as importer.
> > > 
> > > Thanks a lot for your contribution
> > > 
> > >   Andreas.
> > > 
> > > On Fri, Apr 10, 2020 at 09:05:31PM +0200, Paul Gevers wrote:
> > > > Hi Andreas,
> > > > 
> > > > On 09-04-2020 22:53, Andreas Tille wrote:
> > > > > valid_keys = ( 'run_id',
> > > > > #   'created_at',   # Paul Gevers: should be 
> > > > > ignored
> > > > > #   'updated_at',   # Paul Gevers: should be 
> > > > > ignored
> > > > >'suite',
> > > > >'arch',
> > > > >'package', # > should be renamed to 
> > > > > 'source'
> > > > >'version',
> > > > >'trigger',   # usually package.*version
> > > > I expected you to mostly see "" or "migration-reference/0" here, with
> > > > some hand crafted text from random DD's.
> > > > 
> > > > >'status',
> > > > >'requestor', # 'britney', 'debci' or e-mail
> > > > 
> > > > Debian login to be precise, not e-mail.
> > > > 
> > > > >'pin_packages',  # []
> > > > 
> > > > Since a couple of months this json and other pages only show "pure"
> > > > suite runs, to pin_packages is always empty. pin_packages contains which
> > > > packages are taken from another suite than the base suite.
> > > > 
> > > > > #   'worker',   # Paul Gevers: should be 
> > > > > ignored (is 'null' anyway)
> > > > 
> > > > Oh, bug somewhere I guess.
> > > > 
> > > > >'date',
> > > > >'duration_seconds',
> > > > >'last_pass_date',
> > > > >

Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-16 Thread Andreas Tille
Hi Paul,

On Thu, Apr 16, 2020 at 10:30:17AM +0200, Paul Gevers wrote:
> > H, what exactly means "superficial".
> 
> Please read the documentation:
> https://salsa.debian.org/ci-team/autopkgtest/raw/master/doc/README.package-tests.rst

Thanks for this pointer.
 
> > Are all those
> >Testsuite: autopkgtest-pkg-* 
> > 
> > superficial?
> 
> No. Only those that only have superficial tests. E.g. ruby runs the full
> upstream test-suite automatically.

Ahhh!

> > Do they qualify for early testing migration or not?
> 
> Superficial tests *are* neutral, so no.
> 
> > Wouldn't it be more informative to have a fourth category
> > 
> >pass
> >superficial
> >neutral
> >fail
> 
> No, because there are only three states. I.e. superficial and flaky and
> skipped tests all end up meaning the same.
> 
> > My intention was to have a list with packages of our team where either a
> > test is missing or failing.  My idea was that some autopkgtest-pkg-* is
> > "test is not missing".  What is your opinion as debian-ci team about my
> > idea?
> 
> It seems you want to be processing the message field then, but honestly
> if there is an entry, "test is not missing". If there is no entry, "test
> is missing". Failing is "fail". flaky tests are also "test is not
> missing", skipped tests are also "test is not missing". Why would you
> need all those states? You have the "message" field in your UDD schema
> to check why you get the neutral state.

I definitely want to expose the full message (that's why I insisted on
putting it into the table).  My intention is to expose only those packages
where some work needs to be done.  Otherwise the list will be too long.
I will think about the plan after receiving your information; a rough
sketch of the direction I have in mind is below.
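
Something along these lines (a sketch only; I am assuming the autopkgtest
table keeps the package and suite fields from the JSON next to status and
message):

SELECT s.source, s.testsuite, a.status, a.message
  FROM sources s
  LEFT JOIN autopkgtest a ON a.package = s.source
   AND a.suite = 'unstable' AND a.architecture = 'amd64'
 WHERE s.release = 'sid'
   AND (s.testsuite IS NULL OR a.status = 'fail');

That should list exactly the packages where some work is needed - no
declared test at all, or a failing one - while the neutral/superficial
results stay out of the way.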

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-16 Thread Andreas Tille
On Thu, Apr 16, 2020 at 09:32:43AM +0200, Paul Gevers wrote:
> Hi Andreas,
> 
> On 15-04-2020 22:45, Andreas Tille wrote:
> > autodep8-python3 PASS (superficial)
> 
> superficial is translated to neutral. As is FAIL (flaky).

Hmmm, what exactly does "superficial" mean?  Are all those

   Testsuite: autopkgtest-pkg-* 

superficial?  Do they qualify for early testing migration or not?
Wouldn't it be more informative to have a fourth category

   pass
   superficial
   neutral
   fail

My intention was to have a list with packages of our team where either a
test is missing or failing.  My idea was that some autopkgtest-pkg-* is
"test is not missing".  What is your opinion as debian-ci team about my
idea?

Kind regards

 Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-15 Thread Andreas Tille
Hi,

I think I've found some inconsistency of the autopkgtest data:
packages_unstable_amd64.json contains:

{
  "run_id": 4963349,
  "created_at": "2020-04-13 12:01:58 UTC",
  "updated_at": "2020-04-14 20:25:31 UTC",
  "suite": "unstable",
  "arch": "amd64",
  "package": "dnapi",
  "version": "1.1-1",
  "trigger": null,
  "status": "neutral",
  "requestor": "debci",
  "pin_packages": [

  ],
  "worker": null,
  "date": "2020-04-14 19:48:49 UTC",
  "duration_seconds": 32,
  "last_pass_date": null,
  "last_pass_version": "",
  "message": "No tests in this package or all skipped",
  "previous_status": "neutral",
  "duration_human": "32s"
}


But there is a test available:


udd=# SELECT source, release, testsuite FROM sources WHERE source = 'dnapi' ;
 source | release |   testsuite
+-+
 dnapi  | sid | autopkgtest-pkg-python


and it seems to be run:

   https://ci.debian.net/data/autopkgtest/unstable/amd64/d/dnapi/4963349/log.gz

says:

autopkgtest [19:48:46]: test autodep8-python3: set -e ; for py in $(py3versions 
-r 2>/dev/null) ; do cd "$AUTOPKGTEST_TMP" ; echo "Testing with $py:" ; $py -c 
"import dnapilib; print(dnapilib)" ; done
autopkgtest [19:48:46]: test autodep8-python3: [---
Testing with python3.8:

autopkgtest [19:48:47]: test autodep8-python3: ---]
autopkgtest [19:48:47]: test autodep8-python3:  - - - - - - - - - - results - - 
- - - - - - - -
autodep8-python3 PASS (superficial)
autopkgtest [19:48:47]:  summary
autodep8-python3 PASS (superficial)


So shouldn't this be "pass" rather than "neutral"?
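
To check whether there are more cases like this, a query along these lines
could list packages that declare a testsuite in the sources table but whose
unstable/amd64 run is reported as neutral.  This is only a sketch: the column
names are taken from the queries quoted in this thread, and the connection
string is a placeholder.

#!/usr/bin/env python3
# Hypothetical sketch (not UDD code): find more dnapi-like cases.
import psycopg2

QUERY = """
SELECT s.source, s.testsuite, a.status, a.message
  FROM sources s
  JOIN autopkgtest a ON a.package = s.source
 WHERE s.release = 'sid'
   AND a.suite = 'unstable'
   AND a.architecture = 'amd64'
   AND s.testsuite LIKE 'autopkgtest-pkg-%'
   AND a.status = 'neutral'
 ORDER BY s.source
"""

def main():
    conn = psycopg2.connect("service=udd")  # placeholder connection string
    with conn, conn.cursor() as cur:
        cur.execute(QUERY)
        for source, testsuite, status, message in cur.fetchall():
            print("%s: %s -> %s (%s)" % (source, testsuite, status, message))

if __name__ == "__main__":
    main()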

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there some effort to port UDD to Python3?

2020-04-14 Thread Andreas Tille
Hi Lucas,

On Mon, Apr 13, 2020 at 10:40:07PM +0200, Lucas Nussbaum wrote:
> > we all know that Python2 is end of life but several parts of the UDD code
> > are still using Python2.  Is there any effort to port it to Python3?
> 
> Not as far as I know. I suspect that, once it becomes necessary, it will
> be easy to do given the codebase is relatively small.

I agree that the small code base probably makes it easy.  But I'm
worried about the "once it becomes necessary" part.  We all know that
Python2 is only alive due to our security team, and we should actively
work on getting rid of the dependency sooner rather than later.  Working
"under pressure" always makes things harder - no matter how easy it
would be in principle.

I know probably nobody will stop me from doing it - but I'm hesitant to
add another item to my plate, which is already full of Debian Med /
Covid-19 stuff.  I'd volunteer to port those importers I've written myself
once somebody gives the signal - but I'd love it if those who have written
the core parts would take the lead (sooner rather than later).

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-13 Thread Andreas Tille
Hi Lucas,

On Mon, Apr 13, 2020 at 10:37:57PM +0200, Lucas Nussbaum wrote:
> I'm sorry if I haven't paid enough attention. But what is the difference
> with the 'ci' importer?
> 
> https://salsa.debian.org/qa/udd/-/blob/master/rimporters/ci.rb

I think the problem is that UDD is not documented and I simply did not
know about the ci table. :-(

However, when looking at it there is a difference:

udd=# select status, arch, count(*) from ci group by status, arch order by 
status, arch;
 status  | arch  | count 
-+---+---
 fail| amd64 |   925
 neutral | amd64 |  1593
 pass| amd64 | 10805
 tmpfail | amd64 |14
(4 rows)


udd=# select status, architecture, count(*) from autopkgtest group by status, 
architecture order by status, architecture;
 status  | architecture | count 
-+--+---
 fail| amd64|  1561
 fail| arm64|  1471
 fail| ppc64el  |   711
 neutral | amd64|  2879
 neutral | arm64|  1322
 neutral | ppc64el  |   298
 pass| amd64| 21373
 pass| arm64| 4
 pass| ppc64el  |  2458
 tmpfail | amd64|11
 tmpfail | arm64|85
 tmpfail | ppc64el  | 7
(12 rows)

I guess we should merge both and make sure that all data is imported.
I would never have written a new importer if I had been aware of the
existing one - but I do not speak Ruby, so I cannot fix the existing one.
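
As a first step towards such a merge, a small comparison like the following
could show which sources only appear in one of the two tables.  It is just a
sketch: the column names (source vs. package) are guesses based on the queries
above, and the connection string is a placeholder.

#!/usr/bin/env python3
# Hypothetical sketch (not UDD code): compare coverage of the ci and
# autopkgtest tables on amd64.
import psycopg2

def main():
    conn = psycopg2.connect("service=udd")  # placeholder connection string
    with conn, conn.cursor() as cur:
        cur.execute("SELECT DISTINCT source FROM ci WHERE arch = 'amd64'")
        in_ci = {row[0] for row in cur.fetchall()}
        cur.execute("SELECT DISTINCT package FROM autopkgtest"
                    " WHERE architecture = 'amd64'")
        in_new = {row[0] for row in cur.fetchall()}
        print("only in ci:", sorted(in_ci - in_new)[:20])
        print("only in autopkgtest:", sorted(in_new - in_ci)[:20])

if __name__ == "__main__":
    main()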

Kind regards
  Andreas.

> On 11/04/20 at 07:12 +0200, Andreas Tille wrote:
> > Hi Paul,
> > 
> > thanks for the clarification.  This commit
> > 
> >
> > https://salsa.debian.org/qa/udd/-/commit/6a874a89365671dd37a14a9bca25290dc55a1fc9
> > 
> > imports the current data.  I will test this a bit more and then activate
> > it as an importer in a cron job.
> > 
> > Thanks a lot for your contribution
> > 
> >   Andreas.
> > 
> > On Fri, Apr 10, 2020 at 09:05:31PM +0200, Paul Gevers wrote:
> > > Hi Andreas,
> > > 
> > > On 09-04-2020 22:53, Andreas Tille wrote:
> > > > valid_keys = ( 'run_id',
> > > > #   'created_at',   # Paul Gevers: should be ignored
> > > > #   'updated_at',   # Paul Gevers: should be ignored
> > > >'suite',
> > > >'arch',
> > > >'package',   # > should be renamed to 
> > > > 'source'
> > > >'version',
> > > >'trigger',   # usually package.*version
> > > I expected you to mostly see "" or "migration-reference/0" here, with
> > > some hand crafted text from random DD's.
> > > 
> > > >'status',
> > > >'requestor', # 'britney', 'debci' or e-mail
> > > 
> > > Debian login to be precise, not e-mail.
> > > 
> > > >'pin_packages',  # []
> > > 
> > > For a couple of months this json and other pages have only shown "pure"
> > > suite runs, so pin_packages is always empty. pin_packages contains which
> > > packages are taken from another suite than the base suite.
> > > 
> > > > #   'worker',   # Paul Gevers: should be 
> > > > ignored (is 'null' anyway)
> > > 
> > > Oh, bug somewhere I guess.
> > > 
> > > >'date',
> > > >'duration_seconds',
> > > >'last_pass_date',
> > > >'last_pass_version',
> > > >'message',   # see below
> > > >'previous_status',
> > > > #   'duration_human',   # Paul Gevers: duration_seconds 
> > > > and duration_human feel double and the former is leaner for in a 
> > > > database
> > > > #   'blame',# Paul Gevers: should be ignored
> > > >  )
> > > > 
> > > > # message can be
> > > > #  $ grep '"message"' packages*.json | sed 's/^.*\.json: *//'  | sort | 
> > > > uniq
> > > > #  "message": "All tests passed"
> > > > -> "status": "pass"
> > > > #  "message": "Could not run tests due to a temporary testbed failure"  
> > > > -> "status": "tmpfail"
> > > > #  "message": "elbrus"  

Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-10 Thread Andreas Tille
Hi Paul,

thanks for the clarification.  This commit

   
https://salsa.debian.org/qa/udd/-/commit/6a874a89365671dd37a14a9bca25290dc55a1fc9

imports the current data.  I will test this a bit more and then activate
it as an importer in a cron job.

Thanks a lot for your contribution

  Andreas.

On Fri, Apr 10, 2020 at 09:05:31PM +0200, Paul Gevers wrote:
> Hi Andreas,
> 
> On 09-04-2020 22:53, Andreas Tille wrote:
> > valid_keys = ( 'run_id',
> > #   'created_at',   # Paul Gevers: should be ignored
> > #   'updated_at',   # Paul Gevers: should be ignored
> >'suite',
> >'arch',
> >'package',   # > should be renamed to 'source'
> >'version',
> >'trigger',   # usually package.*version
> I expected you to mostly see "" or "migration-reference/0" here, with
> some hand crafted text from random DD's.
> 
> >'status',
> >'requestor', # 'britney', 'debci' or e-mail
> 
> Debian login to be precise, not e-mail.
> 
> >'pin_packages',  # []
> 
> For a couple of months this json and other pages have only shown "pure"
> suite runs, so pin_packages is always empty. pin_packages contains which
> packages are taken from another suite than the base suite.
> 
> > #   'worker',   # Paul Gevers: should be ignored 
> > (is 'null' anyway)
> 
> Oh, bug somewhere I guess.
> 
> >'date',
> >'duration_seconds',
> >'last_pass_date',
> >'last_pass_version',
> >'message',   # see below
> >'previous_status',
> > #   'duration_human',   # Paul Gevers: duration_seconds and 
> > duration_human feel double and the former is leaner for in a database
> > #   'blame',# Paul Gevers: should be ignored
> >  )
> > 
> > # message can be
> > #  $ grep '"message"' packages*.json | sed 's/^.*\.json: *//'  | sort | uniq
> > #  "message": "All tests passed"-> 
> > "status": "pass"
> > #  "message": "Could not run tests due to a temporary testbed failure"  -> 
> > "status": "tmpfail"
> > #  "message": "elbrus"  -> 
> > "status": "tmpfail"
> > #  "message": "Erroneous package"   -> 
> > "status": "fail"
> > #  "message": null  -> 
> > "status": "fail"
> > #  "message": "No tests in this package or all skipped" -> 
> > "status": "neutral"
> > #  "message": "Tests failed",   -> 
> > "status": "fail"
> > #  "message": "Tests failed, and at least one test skipped" -> 
> > "status": "fail"
> > #  "message": "Tests passed, but at least one test skipped" -> 
> > "status": "pass"
> > #  "message": "Unexpected autopkgtest exit code 20" -> 
> > "status": "tmpfail"
> > 
> > I agree that leaving out worker, which is really always null, makes sense,
> > but I tend to keep message since leaving this out looks like losing
> > information.  I tried to find a relation to status but it seems the same
> > status can result in different messages.  I think just one additional field
> > will not blow up UDD much more than it currently is - maybe I'll consider
> > a normalised form, but usually UDD is not very normalised at all.
> 
> In general this is the final output from autopkgtest. But, as you see my
> name there, I had to clean up several times and to be able to find those
> back, I added my nick to the message. The list thus may change over time.
> 
> Paul
> 




-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-09 Thread Andreas Tille
Hi Paul,

On Thu, Apr 09, 2020 at 10:44:18AM +0200, Paul Gevers wrote:
> 
> > May be you consider some fields as really restricted to some
> > special applications and nobody would ever consider querying
> > UDD for it?
> 
> Yes, I wouldn't add blame (I don't think we add those anymore, it's
> legacy, I only found two occurrences on amd64), created_at and
> updated_at. I also don't think that worker and message are very useful,
> but one never knows. duration_seconds and duration_human feel double and
> the former is leaner for in a database.

So I probably go with the following keys (in Python syntax):

valid_keys = ( 'run_id',
#   'created_at',   # Paul Gevers: should be ignored
#   'updated_at',   # Paul Gevers: should be ignored
   'suite',
   'arch',
   'package',   # > should be renamed to 'source'
   'version',
   'trigger',   # usually package.*version
   'status',
   'requestor', # 'britney', 'debci' or e-mail
   'pin_packages',  # []
#   'worker',   # Paul Gevers: should be ignored (is 
'null' anyway)
   'date',
   'duration_seconds',
   'last_pass_date',
   'last_pass_version',
   'message',   # see below
   'previous_status',
#   'duration_human',   # Paul Gevers: duration_seconds and 
duration_human feel double and the former is leaner for in a database
#   'blame',# Paul Gevers: should be ignored
 )

# message can be
#  $ grep '"message"' packages*.json | sed 's/^.*\.json: *//'  | sort | uniq
#  "message": "All tests passed"-> 
"status": "pass"
#  "message": "Could not run tests due to a temporary testbed failure"  -> 
"status": "tmpfail"
#  "message": "elbrus"  -> 
"status": "tmpfail"
#  "message": "Erroneous package"   -> 
"status": "fail"
#  "message": null  -> 
"status": "fail"
#  "message": "No tests in this package or all skipped" -> 
"status": "neutral"
#  "message": "Tests failed",   -> 
"status": "fail"
#  "message": "Tests failed, and at least one test skipped" -> 
"status": "fail"
#  "message": "Tests passed, but at least one test skipped" -> 
"status": "pass"
#  "message": "Unexpected autopkgtest exit code 20" -> 
"status": "tmpfail"

I agree that leaving out worker, which is really always null, makes sense,
but I tend to keep message since leaving this out looks like losing
information.  I tried to find a relation to status but it seems the same
status can result in different messages.  I think just one additional field
will not blow up UDD much more than it currently is - maybe I'll consider
a normalised form, but usually UDD is not very normalised at all.
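
To keep an eye on that relation without fully normalising anything, the
observed message/status pairs could simply be kept as a lookup table so the
importer can warn when ci.debian.net starts emitting a combination we have
not seen before.  A minimal sketch, with the pairs copied from the grep
output above (the helper name is made up for illustration):

# Hypothetical sketch: sanity-check message/status pairs from packages_*.json.
KNOWN_MESSAGE_STATUS = {
    "All tests passed": "pass",
    "Could not run tests due to a temporary testbed failure": "tmpfail",
    "elbrus": "tmpfail",
    "Erroneous package": "fail",
    None: "fail",
    "No tests in this package or all skipped": "neutral",
    "Tests failed": "fail",
    "Tests failed, and at least one test skipped": "fail",
    "Tests passed, but at least one test skipped": "pass",
    "Unexpected autopkgtest exit code 20": "tmpfail",
}

def check_record(record):
    """Return a warning string for one packages_*.json entry, or None."""
    expected = KNOWN_MESSAGE_STATUS.get(record.get("message"), "unknown")
    if expected != record.get("status"):
        return "unexpected message/status pair: %r -> %r" % (
            record.get("message"), record.get("status"))
    return None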

I wonder what the meaning of pin_packages might be, since it is always
equal to [].

Kind regards

 Andreas.



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-09 Thread Andreas Tille
Hi Antonio,

greetings to Curitiba. ;-)
Hope you are fine!

On Thu, Apr 09, 2020 at 09:37:41AM -0300, Antonio Terceiro wrote:
> On Thu, Apr 09, 2020 at 10:44:18AM +0200, Paul Gevers wrote:
> > Antonio, what do you think? If we expose the blacklist, I can also have
> > britney consume it.
> 
> I think it could be exposed, yes. I'm currrently working on optimizing
> the generation of all of that data, so I added to my TODO list and will
> sneak that in.

Do you expect new important fields in the json file or do you intend to
change its structure in principle?  It would be stupid for me to write a
gatherer for UDD only to learn, once it is finished, that basic things have
changed.

Kind regards

 Andreas.

-- 
http://fam-tille.de



Re: [UDD] Is there any information about failed autopkgtest in UDD?

2020-04-05 Thread Andreas Tille
Hi Paul,

On Thu, Apr 02, 2020 at 05:20:54PM +0200, Paul Gevers wrote:
> It depends what you look for. If you're concerned about failures that
> will impact migration, the canonical place is:
> https://release.debian.org/britney/excuses.yaml.gz (or the non-zipped
> version) or directly (around 20 minutes earlier) on respighi. This yaml
> includes the links you are looking for.

I've checked this and found some things missing for our purpose.  For example,
that file contains some instances of deepnano, but only as a rdepends of
theano:

$ grep -B11 -A2 deepnano excuses.yaml | grep -v -e ' - null' -e '[wd]-version' 
-e '^  m' | sed -e '/^--/,/arm64:/d' -e '/policy_info/,/autopkgtest:/d'
  item-name: theano
  deepnano:
amd64:
- RUNNING-ALWAYSFAIL
- https://ci.debian.net/status/pending
- https://ci.debian.net/packages/d/deepnano/testing/amd64
arm64:
- RUNNING-ALWAYSFAIL
- https://ci.debian.net/status/pending
- https://ci.debian.net/packages/d/deepnano/testing/arm64
- PASS
- 
https://ci.debian.net/data/autopkgtest/testing/arm64/d/debconf-kde/4796849/log.gz
- https://ci.debian.net/packages/d/debconf-kde/testing/arm64
  deepnano:
amd64:
- RUNNING-ALWAYSFAIL
- https://ci.debian.net/status/pending
- https://ci.debian.net/packages/d/deepnano/testing/amd64
arm64:
- RUNNING-ALWAYSFAIL
- https://ci.debian.net/status/pending
- https://ci.debian.net/packages/d/deepnano/testing/arm64
- PASS
- 
https://ci.debian.net/data/autopkgtest/testing/arm64/d/dask/4786421/log.gz
- https://ci.debian.net/packages/d/dask/testing/arm64
  deepnano:
amd64:
- RUNNING-ALWAYSFAIL
- https://ci.debian.net/status/pending
- https://ci.debian.net/packages/d/deepnano/testing/amd64
arm64:
- RUNNING-ALWAYSFAIL
- https://ci.debian.net/status/pending
- https://ci.debian.net/packages/d/deepnano/testing/arm64

Please excuse my rough parsing of that yaml file.
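
A cleaner way would probably be to parse the YAML directly.  Here is a minimal
sketch; the key names ("sources", "item-name", "policy_info", "autopkgtest")
are assumed from the fragment above and may need adjusting if britney changes
the format:

#!/usr/bin/env python3
# Hypothetical sketch: pull autopkgtest verdicts for one source package
# out of excuses.yaml.gz without grep/sed.
import gzip
import urllib.request
import yaml

URL = "https://release.debian.org/britney/excuses.yaml.gz"

def autopkgtest_excuses(wanted):
    with urllib.request.urlopen(URL) as resp:
        data = yaml.safe_load(gzip.decompress(resp.read()))
    for item in data.get("sources", []):
        tests = item.get("policy_info", {}).get("autopkgtest", {})
        for pkg, per_arch in tests.items():
            if pkg.split("/")[0] == wanted:
                yield item.get("item-name"), pkg, per_arch

if __name__ == "__main__":
    for trigger, pkg, per_arch in autopkgtest_excuses("deepnano"):
        print(trigger, pkg, per_arch)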

Now since I became suspicious about deepnano itself when checking tracker

https://tracker.debian.org/pkg/deepnano

it does not even have any debci entry even though deepnano has an autopkgtest.
I admit I consider this a bug in tracker.
 
> If you're more interested in regressions is pure suites (like DDPO is
> showing), than the results are available from
> https://ci.debian.net/data/status/ e.g.
> https://ci.debian.net/data/status/testing/amd64/packages.json for

When greping this file for deepnano I get no hit at all.

> testing/amd64 The URL needs to be constructed:
> https://ci.debian.net/data/autopkgtest/testing/amd64log.gz
> where package_letter is the first character for all packages except
> packages that start with lib, where  are the first four
> characters.
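
If I read that correctly, the prefix rule is the usual pool-style one; a
minimal sketch (the helper name is made up for illustration), which reproduces
the dask and debconf-kde log links in the excuses fragment above:

# Hypothetical sketch of the directory-prefix rule described above.
def log_url(suite, arch, source, run_id):
    prefix = source[:4] if source.startswith("lib") else source[0]
    return ("https://ci.debian.net/data/autopkgtest/"
            "%s/%s/%s/%s/%s/log.gz" % (suite, arch, prefix, source, run_id))

# e.g. log_url("testing", "arm64", "dask", 4786421) gives the dask link above.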

Hmmm, it seems I really need to parse these URLs

https://ci.debian.net/packages/d/deepnano/  [1]

for all source packages (in the same way as for deepnano), which does not
seem straightforward.  I wonder whether you could drop some easily
parsable file somewhere containing

source  architecturepassversion version_that_has_passed_before

or something like this - probably I've missed some important field.  This
could be imported into UDD, and tracker could be based on some UDD table that
imports this information.

Do you think it is feasible to put those data somewhere for easy UDD
import?

OK, when reading [1] it says:

This package is currently blacklisted and will not have any new test runs. 

but why? My goal is to assemble information about all Blends packages
with a failing autopkgtest, in order to give people intending to do some
QA work a good place to look for tasks.  I have no idea how I can easily
get this information in a structured way.

Kind regards

   Andreas.

-- 
http://fam-tille.de



[UDD] Is there some effort to port UDD to Python3?

2020-04-01 Thread Andreas Tille
Hi,

we all know that Python2 is end of life but several parts of the UDD code
are still using Python2.  Is there any effort to port it to Python3?  If
not, are there any volunteers to do this?

Kind regards

  Andreas.

-- 
http://fam-tille.de



[UDD] Is there any information about failed autopkgtest in UDD?

2020-04-01 Thread Andreas Tille
Hi,

I intend to enhance the Blends framework with a QA page which should also
include information about failed autopkgtests.  Unfortunately I cannot
find any information about this in UDD.  The sources table contains
a field testsuite which tells whether the package has a test or not,
but where can I find whether the test succeeds (and maybe a link to the
failed log)?

If this information is really missing (and I did not just miss it
inside the current UDD), where is the best source of metadata to parse to
fetch this information into UDD?  I have written some UDD importers and
could consider writing another one.
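
In case it helps the discussion, this is roughly the shape of importer I have
in mind - only a sketch, with placeholder table/column names and connection
string; the JSON field names are assumptions based on the packages_*.json
files published by ci.debian.net that are discussed in the rest of this thread:

#!/usr/bin/env python3
# Hypothetical sketch (not the actual UDD importer).
import json
import urllib.request
import psycopg2

URL = "https://ci.debian.net/data/status/unstable/amd64/packages.json"

def main():
    with urllib.request.urlopen(URL) as resp:
        records = json.load(resp)
    conn = psycopg2.connect("service=udd")  # placeholder connection string
    with conn, conn.cursor() as cur:
        for r in records:
            cur.execute(
                """INSERT INTO autopkgtest
                       (run_id, suite, arch, package, version, status, message)
                   VALUES (%s, %s, %s, %s, %s, %s, %s)""",
                (r["run_id"], r["suite"], r["arch"], r["package"],
                 r["version"], r["status"], r.get("message")))

if __name__ == "__main__":
    main()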

Kind regards

   Andreas.

[1] my personal agenda item one at

https://salsa.debian.org/med-team/community/2020-covid19-hackathon/-/wikis/Covid-19-hackathon

-- 
http://fam-tille.de



Re: DEP12: debian/upstream/metdata doesn't allow specifying the VCS branch

2020-01-17 Thread Andreas Tille
On Wed, Dec 04, 2019 at 06:05:43PM +, Jelmer Vernooij wrote:
> > 
> > This is another alternative to make things clearer and maybe this
> > should be preferred.
> 
> Is there a process for making changes to debian/upstream/metadata
> specification at the moment? Let me know if I should go and edit the
> wiki page.

Sorry, this remained unanswered for a long time.  So, yes, we
have no other method than editing the Wiki page after discussing it
here (which you did).

Kind regards

  Andreas.

-- 
http://fam-tille.de



Re: DEP12: debian/upstream/metdata doesn't allow specifying the VCS branch

2019-12-02 Thread Andreas Tille
Hi Jelmer,

On Tue, Dec 03, 2019 at 02:27:58AM +, Jelmer Vernooij wrote:
> The debian/upstream/metadata file spec
> (https://wiki.debian.org/UpstreamMetadata) currently supports
> a "Repository" field, but does not document a way of specifying what
> branch in the repository the upstream sources are in.
> 
> This can be necessary when upstream e.g. has multiple release series
> and the Debian package is tracking just one.
> 
> Would it be possible to allow specification of a branch?

Sure it is possible.  I see your point but I admit for the moment I have
no (machine-readable oriented) use case - so I don't mind personally.

> I can see two
> possible ways of allowing this:
> 
> * Add a new "Branch" field in the YAML file that goes along with the
>   Repository field. E.g.:
> 
> Repository: https://git.samba.org/samba.git
> Branch: 4.7

As I said I would not mind much personally, but just "Branch" is a bit
generic.  In case this form is preferred, Repository-Branch sounds a bit
clearer to me.
 
> * Allow specifying the branch in the Repository header somehow. This
>   would be more consistent with what happens for e.g. the packaging
>   metadata headers in debian/control. E.g.:
> 
> Repository: https://git.samba.org/samba.git -b 4.7

This is another alternative to make things clearer and maybe this
should be preferred.
 
Kind regards

   Andreas.

-- 
http://fam-tille.de



Bug#924838: Any idea how to fix remote access when trying to build package?

2019-04-22 Thread Andreas Tille
Hi,

any idea how to fix the attempt to access remote location when
trying to build?

Kind regards

   Andreas.

-- 
http://fam-tille.de



Bug#924838: Maintenance of scoop (Was: FTBFS: Could not import extension sphinx.ext.pngmath (exception: No module named pngmath))

2019-04-11 Thread Andreas Tille
Hi,

I pushed a patch for #924838 to Git but unfortunately there is an issue with
the build-time test suite that tries to access remote hosts:

...
   debian/rules override_dh_auto_test
make[1]: Entering directory '/build/scoop-0.7.1.1'
PYBUILD_SYSTEM=custom \
PYBUILD_TEST_ARGS="cd {dir}/test; {interpreter} tests.py" \
dh_auto_test
I: pybuild base:217: cd /build/scoop-0.7.1.1/test; python2.7 tests.py
Traceback (most recent call last):
  File "tests.py", line 30, in 
from tests_parser import TestUtils
  File "/build/scoop-0.7.1.1/test/tests_parser.py", line 1, in 
from scoop import utils
  File "/build/scoop-0.7.1.1/.pybuild/cpython2_2.7_scoop/build/scoop/utils.py", 
line 43, in 
ip for ip in socket.gethostbyname_ex(socket.gethostname())[2]
socket.gaierror: [Errno -3] Temporary failure in name resolution
E: pybuild pybuild:341: test: plugin custom failed with: exit code=1: cd 
/build/scoop-0.7.1.1/test; python2.7 tests.py


I used

export http_proxy='127.0.0.1:9'

as a known workaround but that fails as well.

Before I dive deeper into this:  Daniel, you orphaned the package.
Maybe I missed something, but did you ask here on the list whether
there is somebody who intends to take over?  Is there any issue with
that software which makes it less interesting for the Buster release or
something like this?

I have no personal interest in this package at all but I'm doing my
usual QA work for Debian Science owned packages (or those that are at
least relevant for Debian Science, are in Debian Science Git etc.)

Kind regards

   Andreas


-- 
http://fam-tille.de



Bug#920459: toulbar2: What will happen if testing migration takes longer than removal from testing

2019-02-19 Thread Andreas Tille
On Tue, Feb 19, 2019 at 09:07:24AM +0100, Ondřej Surý wrote:
> I believe that it was the case before that if the autoremoval was due a 
> specific RC bug, any activity on that specific bug would reset the timer for 
> autoremoval.

I thought the same, but despite the activity it is not reset (at least
not according to tracker[1] or the autoremovals query[2]).
 
> But it might have changed since… or my memory is failing me.

I think you are right, but that has obviously changed (either due to
the freeze or in general).

Kind regards

   Andreas.

[1] https://tracker.debian.org/pkg/toulbar2
[2] https://udd.debian.org/cgi-bin/autoremovals.cgi
 
> > On 19 Feb 2019, at 08:46, Andreas Tille  wrote:
> > 
> > Hi,
> > 
> > toulbar2 is
> > 
> >   Marked for autoremoval on 22 February: #916715
> > 
> > However, this bug was closed in
> > 
> > 
> > toulbar2 (1.0.0+dfsg3-1.1) unstable; urgency=medium
> > 
> >  * Non-maintainer upload.
> >  * Add the missing build dependency on zlib1g-dev. (Closes: #916715)
> > 
> > -- Adrian Bunk   Fri, 11 Jan 2019 13:47:51 +0200
> > 
> > 
> > The problem is that the package did not migrate due to #920459 (doxygen
> > currently breaks lots of packages and I wonder in general what will
> > happen with those packages).  I now uploaded
> > 
> > 
> > toulbar2 (1.0.0+dfsg3-2) unstable; urgency=medium
> > ...
> >  * Prevent generation of PDF documentation since otherwise toulbar2 does
> >not build (see bug #920459).  This means should be reverted once doxygen
> >is fixed.
> > ...
> > -- Andreas Tille   Mon, 18 Feb 2019 22:17:10 +0100
> > 
> > 
> > Which enabled the build on all release architectures.
> > 
> > I'm simply wondering what will happen with toulbar2 (and other packages
> > - I'm actually not that much involved in this, it is just a random
> > Debian Science package) once it was removed from testing.  As far as I
> > understood there will be no migrations from unstable to testing any more
> > if there is no version of that package in testing.  Does that mean that
> > the doxygen issues will kick several packages out of Buster or is there
> > any way to prevent this?
> > 
> > Kind regards
> > 
> >Andreas.
> > 
> > -- 
> > http://fam-tille.de
> > 
> 
> 

-- 
http://fam-tille.de



Bug#920459: toulbar2: What will happen if testing migration takes longer than removal from testing

2019-02-18 Thread Andreas Tille
Hi,

toulbar2 is

   Marked for autoremoval on 22 February: #916715

However, this bug was closed in


toulbar2 (1.0.0+dfsg3-1.1) unstable; urgency=medium

  * Non-maintainer upload.
  * Add the missing build dependency on zlib1g-dev. (Closes: #916715)

 -- Adrian Bunk   Fri, 11 Jan 2019 13:47:51 +0200


The problem is that the package did not migrate due to #920459 (doxygen
currently breaks lots of packages and I wonder in general what will
happen with those packages).  I now uploaded


toulbar2 (1.0.0+dfsg3-2) unstable; urgency=medium
...
  * Prevent generation of PDF documentation since otherwise toulbar2 does
not build (see bug #920459).  This means should be reverted once doxygen
is fixed.
...
 -- Andreas Tille   Mon, 18 Feb 2019 22:17:10 +0100


Which enabled the build on all release architectures.

I'm simply wondering what will happen with toulbar2 (and other packages
- I'm actually not that much involved in this, it is just a random
Debian Science package) once it has been removed from testing.  As far as I
understand, there will be no migrations from unstable to testing any more
if there is no version of that package in testing.  Does that mean that
the doxygen issues will kick several packages out of Buster, or is there
any way to prevent this?

Kind regards

Andreas.

-- 
http://fam-tille.de



Bug#921779: Bug#919413: cascade of FTBFS

2019-02-14 Thread Andreas Tille
On Thu, Feb 14, 2019 at 03:16:22PM +0100, Dominique Dumont wrote:
> On Tuesday, 12 February 2019 16:54:12 CET Andreas Tille wrote:
> > I'm
> > not sure how to deal with the jquery.js one since this is potentially an
> > issue with lots of dependencies - I remember discussions about this
> > which I did not follow.
> 
> Fortunately, jquery is available as a Debian package.

Sure it is.  I simply remember some discussions about why doxygen needs its
own jquery.  I'd be really happy if this is not the case any more.
 
Kind regards

   Andreas.

-- 
http://fam-tille.de



Bug#921779: cascade of FTBFS

2019-02-12 Thread Andreas Tille
Control: tags -1 help

Hi Paolo,

in my attempt to see what I can do for #921779 which breaks several
packages I stumbled upon your attempt to adopt doxygen.  Thanks a lot
for this brave intention. ;-)

I realised that the watch file did not work properly - feel free to `git
am` the attached patch.  I also noticed that there are remaining lintian
errors:

E: doxygen source: source-is-missing templates/html/jquery.js line length is 
32401 characters (>512)
N: 
N:The source of the following file is missing. Lintian checked a few
N:possible paths to find the source, and did not find it.
N:
N:Please repack your package to include the source or add it to
N:"debian/missing-sources" directory.
N:
N:If this is a false-positive, please report a bug against Lintian.
N:
N:Please note, that insane-line-length-in-source-file tagged files are
N:likely tagged source-is-missing. It is a feature not a bug.
N:
N:Severity: serious, Certainty: possible
N:
N:Check: cruft, Type: source
N: 
E: doxygen source: source-is-missing templates/html/menu.js line length is 695 
characters (>512)
E: doxygen source: source-is-missing templates/html/svgpan.js line length is 
312 characters (>256)


You should override the latter two since these are false positives.  I'm
not sure how to deal with the jquery.js one since this is potentially an
issue with lots of dependencies - I remember discussions about this
which I did not follow.

Regarding the ratt results[1] the issue

   librostlab: ! LaTeX Error: File 'listofitems.sty' not found.

can be solved by a doxygen-latex Build-Depends - at least I added this
to frobby in Git[2], which fixed this particular error, but after that the
build ran into:

[572]
! Undefined control sequence.
l.211 ...+t-1$. The inner slice will have $(a\text
  {'},b)$, where $a^\prime$ ...

?.
! Emergency stop.
l.211 ...+t-1$. The inner slice will have $(a\text
  {'},b)$, where $a^\prime$ ...

!  ==> Fatal error occurred, no output PDF file produced!



I have no idea how to fix this.

Kind regards

   Andreas.



[1] https://salsa.debian.org/paolog-guest/doxygen/wikis/ratt4
[2] 
https://salsa.debian.org/science-team/frobby/commit/d2fd89875e8491a755ee702e710aa9e003982ee7

-- 
http://fam-tille.de
>From b0a8a6a14c391fbc40489ab6df984435efaba1c4 Mon Sep 17 00:00:00 2001
From: Andreas Tille 
Date: Tue, 12 Feb 2019 16:02:57 +0100
Subject: [PATCH] Fix watch file

---
 debian/changelog | 4 
 debian/watch | 4 ++--
 2 files changed, 6 insertions(+), 2 deletions(-)

diff --git a/debian/changelog b/debian/changelog
index beb9ad6..6ca0d0b 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,5 +1,6 @@
 doxygen (1.8.15-1) unstable; urgency=medium
 
+  [ Paolo Greppi ]
   * doxygen 1.8.15 release. Closes: #920447.
   * Do not produce "Directory Reference" man pages. Closes: #742871.
   * Bump debhelper compat.
@@ -13,6 +14,9 @@ doxygen (1.8.15-1) unstable; urgency=medium
   * Switch to llvm-toolchain-7. Closes: #912799.
   * Make the output of $year reproducible. Closes: #863054
 
+  [ Andreas Tille ]
+  * Fix watch file
+
  -- Paolo Greppi   Tue, 05 Feb 2019 16:45:02 +0100
 
 doxygen (1.8.13-10) unstable; urgency=medium
diff --git a/debian/watch b/debian/watch
index f688d00..2e294c4 100644
--- a/debian/watch
+++ b/debian/watch
@@ -1,4 +1,4 @@
-version=3
+version=4
 
-opts=filenamemangle=s/.+\/Release_(\d\S*)\.tar\.gz/doxygen-$1.tar\.gz/ \
+opts=uversionmangle=s/_/./g,filenamemangle=s/.+\/Release_(\d\S*)\.tar\.gz/doxygen-$1.tar\.gz/ \
   https://github.com/doxygen/doxygen/tags .*/Release_(\d\S*)\.tar\.gz
-- 
2.20.1



[UDD] upstream data exiting with error

2019-01-27 Thread Andreas Tille
Hi,

it seems upstream data is currently not properly gathered.  I checked

./update-and-run.sh upstream
Traceback (most recent call last):
/srv/udd.debian.org/udd/rudd:62:in `': No option specified (RuntimeError)


and realised that I have no idea how these Ruby-based importers are
supposed to work - so I have no idea how to fix this.

Kind regards

   Andreas.

- Forwarded message from Cron Daemon  -

Date: Sun, 27 Jan 2019 12:00:06 +
From: Cron Daemon 
To: lu...@debian.org, ti...@debian.org
Subject: Cron  /srv/udd.debian.org/udd/rudd --status

Importers not running recently:
 - security-tracker (last run: 2019-01-27 06:02:47 +)
 - vcswatch (last run: 2019-01-27 10:53:34 +)
 - mentors (last run: 2019-01-27 08:01:37 +)
 - orphaned-packages (last run: 2019-01-27 11:24:40 +)
Importers exiting with errors:
 - upload-history (last: 2019-01-27 00:49:01 +, status: 1, last OK: 
2019-01-16 20:33:03 +)
 - upstream (last: 2019-01-27 10:53:51 +, status: 1, last OK: 2019-01-22 
21:37:13 +)

Detailed status: https://udd.debian.org/udd-status.cgi


- End forwarded message -

-- 
http://fam-tille.de



Re: #705208,ITA: pylibtiff -- wrapper to the libtiff library to Python using ctypes

2018-12-23 Thread Andreas Tille
I'd be happy if you take pylibtiff, Andreas.

On Sun, Dec 23, 2018 at 08:37:55PM +0100, Antonio Valentino wrote:
> Hello,
> I would like to adopt the pylibtiff [1] package because it is a
> dependency of the new satpy that I'm going to package (#917110).
> 
> The pylibtiff package would become part of the Debian GIS project.
> 
> 
> Please let me know it there is something that prevents the adoption.
> 
> [1] https://tracker.debian.org/pkg/pylibtiff
> 
> 
> kind regards
> 
> -- 
> Antonio Valentino
> 
> 
> 

-- 
http://fam-tille.de



Re: What criterion is used for "has autopkgtest and migrates to testing in 2 days"?

2018-09-12 Thread Andreas Tille
On Wed, Sep 12, 2018 at 04:35:00PM +, Niels Thykier wrote:
> 
> The r-cran-snakecase autopkgtests current fails:
> https://ci.debian.net/packages/r/r-cran-snakecase/testing/amd64/

Grrr, there is a missing test-depends: r-cran-purrrlyr which needs to be
packaged.  I wonder why it escaped my attention since I'm using the
pbuilder hook to run autopkgtest ...
 
Kind regards

  Andreas. 

-- 
http://fam-tille.de



What criterion is used for "has autopkgtest and migrates to testing in 2 days"?

2018-09-12 Thread Andreas Tille
Hi,

sorry, not sure whether the debian-qa list is the right place, but Paul is in To
anyway.  The package r-cran-snakecase has two ways to test:

   1. debian/control defines Testsuite: autopkgtest-pkg-r
   2. debian/tests/control.autodep8

(If 1. is given and there is also a debian/tests/control, lintian is unhappy
 - thus I'm using control.autodep8.)

Both do not seem to qualify for speedy testing migration according to

   https://tracker.debian.org/pkg/r-cran-snakecase

Am I missing something?

Kind regards

 Andreas.

-- 
http://fam-tille.de



BTS seems to send bugs to old maintainer of a package but displays new one correctly (Was: Bug#908065: r-cran-adegraphics: autopkgtest regression: dependency versions not properly specified)

2018-09-07 Thread Andreas Tille
Hi,

On Thu, Sep 06, 2018 at 10:31:34PM +0200, Paul Gevers wrote:
> 
> > BTW, do you have any idea why that bug is filed against Debian Med
> > packaging list[1] while the maintainer of the package is
> > r-pkg-t...@alioth-lists.debian.net?
> 
> tracker.d.o?

Hmmm, why should tracker.d.o keep a record where a package was
maintained *before* instead of just taking the Maintainer field of the
current package?  At least the *rendered* information of tracker.d.o[1]
mentions Debian R Packages Maintainers.

The result of this wrong choice of team somehow hides the todo item for
dh-r from the people who should care (I'll forward a summary, but would
like to understand this issue first).  The BTS says on the bug page:

   Maintainer for src:r-cran-adegraphics is Debian R Packages Maintainers 


but in contrast to this it sends the e-mail to debian-med-packaging[3].

Any idea why the BTS sends the e-mail to the maintainer that is mentioned
in, for instance, the package in stable rather than the one in unstable
(which is obviously parsed correctly)?

Kind regards

   Andreas.

[1] https://tracker.debian.org/pkg/r-cran-adegraphics
[2] https://bugs.debian.org/908065
[3] 
https://alioth-lists.debian.net/pipermail/debian-med-packaging/2018-September/065335.html

-- 
http://fam-tille.de



[UDD] carnivore_gatherer.py has trouble parsing Maintainer name containing ','

2018-06-22 Thread Andreas Tille
Hi,

I remember there was some discussion about ',' in maintainer names on
the debian-devel list.  I'm now realising that the carnivore_gatherer has
an obvious problem with a ',' in the maintainer name.


udd=#  SELECT * FROM carnivore_names where name like '%Language Processing%' or 
name like '%Japanese%' or name like '%Adam C. Powell%' or name like '%IV%'; 
  id  |  name  
--+
  765 | "Natural Language Processing "
  765 | Natural Language Processing
 2815 |  IV"
 2815 | Adam C. Powell IV
 2815 |  IV
 2815 | IV
 2815 | "Adam C. Powell
 2815 | Adam C. Powell
 3154 | Japanese"
 3154 | "Natural Language Processing
 4027 | IV
(11 rows)


udd=# SELECT DISTINCT source, maintainer_name, release FROM sources where 
maintainer_name like '%,%' and release = 'sid';
  source  |maintainer_name| release 
--+---+-
 kakasi   | Natural Language Processing, Japanese | sid
 triangle | Adam C. Powell, IV| sid
(2 rows)

I wonder whether the carnivore_gatherer could be convinced to inject
only one ID (preferably also only one spelling of the corresponding
maintainer name).
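
For what it's worth, the stdlib RFC 2822 tools keep a quoted comma inside the
display name, so - assuming the control fields quote such names - a parser
based on email.utils would see one entry instead of two.  A minimal sketch
with made-up example addresses:

# Hypothetical sketch: parse Maintainer-style fields with email.utils
# instead of splitting on ','.
from email.utils import getaddresses

examples = [
    '"Natural Language Processing, Japanese" <nlp-ja@example.org>',
    '"Adam C. Powell, IV" <powell@example.org>',
]
for field in examples:
    # Each field yields a single (name, address) pair; the comma stays in the name.
    print(getaddresses([field]))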

Kind regards

Andreas.

-- 
http://fam-tille.de



Re: Redundant fields in debian/upstream/metadata and possible lintian check

2018-05-27 Thread Andreas Tille
Hi Charles,

On Sun, May 27, 2018 at 04:46:10PM +0900, Charles Plessy wrote:
> thans for raising the issue.  In brief, there have been unsuccessful
> experiments, but I think that we should let people experiment.

Sure.  May be without this experiment we would not yet have an
established way for scientific references.  So your initial
experiment is really appreciated and had great consequences.
 
> Le Wed, May 23, 2018 at 03:39:44PM +0200, Andreas Tille a écrit :
> > 
> > at least in scientific packages the file debian/upstream/metadata is
> > frequently used since it is the established way to specify citations
> > belonging to some software..  The definition of the fields is given in
> > Wiki[1].
> 
> Side note: while in practice, multiple people are subscribed to
> notifications of changes of this page, which means that non-consensual
> additions would be quickly spotted, I think that it would be good to
> clearly mark which fields are broadly used and accepted, and which are
> more experimental or anectodical.

+1
 
> > These data are gathered in UDD[2].  When I inspected the log of the UDD
> > importer I noticed that there are a lot of redundant fields like
> > "Homepage" or "Watch" where we agreed that these fields should not be
> > duplicated in upstream/metadata.
> 
> Indeed, I was hoping that they could in the long term supersede the
> Homepage field of debian/control and the debian/watch file, but it is
> not going to happen anytime soon.

It might be considered as one of several failed attempts to keep
upstream information out of the default packaging files.

> I think that it would have been nice
> to be able to propagate updates of this information to the Debian
> infrastructure without doing a package upload, but...

It can be propagated to UDD - so it's halfway there.

> I guess that
> somebody else will eventually find a better way to do this.

+1
 
> > There are also typos and freely invented Fields which are not
> > specified on Wiki[1] (like Distributor', 'CRAN', 'Wiki').  I think it
> > makes sense to have some lintian check for this undefined fields.  I
> > think I'll file a wishlist bug about this soon.
> 
> I think that it is important to let people experiemnt and introduce new
> fields.  Nevertheless, typos and fields with too similar names should be
> prevented.

Yes, experiments are perfectly welcome but lintian warnings will not
prevent experimenting.

> Maybe a Lintian check could send a warning for any field
> that has an edit distance of 1 compared with all the "broadly used
> and accepted" fields ?

I have no idea about lintian - so I don't know whether this is too
complex.
 
> > However, before I do I'd like to discuss the fields Name and Contact.
> > DEP8 defines[3] the fields Upstream-Name and Upstream-Contact which are
> > the same values in a file that has a high probability to be properly
> > maintained.
> 
> (You mean DEP 5).

Yes.

> Since the upstream contact address is volatile, I
> think that ideally it would be better placed in the upstream/metadata
> file.  But that may require a change of Policy and other potentially
> very painful discussions...

Good point.
 
> > In the case of r-* packages from CRAN or Bioconductor it can be even
> > automatically updated (via dh-update-R ... its actually not really
> > done but I think this could be implemented easily - dh-make-R at least
> > generates the fields at the time of initial package creation).
> > 
> > I wonder whether we should be maintaining duplicated information and thus
> > would like to hear your opinion about orphaning these fields in
> > debian/upstream/metadata.
> 
> Homepage and Watch were already removed from the wiki page some time
> ago, but could come back into an "actively deprecated" list of fields.
> 
> I think that any duplication done by hand for a long time is going to
> create noise and cost time.  But duplications made by automation tools
> such as dh-update-R are potentially useful and sould be more considered
> as "synchronisations" which propagate information from external source
> into the Debian infrastructure.

In any case we should have one source of information - I do not mind
automatic propagation of it by some tool.  For the moment I think
the removal of these redundant fields is a sensible way to go.  If we
consider automatic updates, this can be done in a consistent way.

Kind regards

 Andreas.

-- 
http://fam-tille.de



Re: Redundant fields in debian/upstream/metadata and possible lintian check

2018-05-24 Thread Andreas Tille
Hi Dylan,

On Thu, May 24, 2018 at 01:25:50PM +0200, Dylan Aïssi wrote:
> 2018-05-23 15:39 GMT+02:00 Andreas Tille <andr...@an3as.eu>:
> > These data are gathered in UDD[2].  When I inspected the log of the UDD
> > importer I noticed that there are a lot of redundant fields like
> > "Homepage" or "Watch" where we agreed that these fields should not be
> > duplicated in upstream/metadata.  There are also typos and freely
> > invented fields which are not specified on the Wiki[1] (like 'Distributor',
> > 'CRAN', 'Wiki'). I think it makes sense to have some lintian check for
> > these undefined fields.  I think I'll file a wishlist bug about this
> > soon.
> 
> There is already an open bug #731340 for this [1], so it should be
> better to make noise in this one. The bug report contains a patch to
> check valid fields but it needs to be modified before to be merged
> into lintian (currently, I don't have time to do it).

Thanks for refreshing my memory (and I vaguely remember this is not the
first time you had to do so. ;-) )

> > However, before I do I'd like to discuss the fields Name and Contact.
> > DEP8 defines[3] the fields Upstream-Name and Upstream-Contact which are
> > the same values in a file that has a high probability to be properly
> > maintained.  In the case of r-* packages from CRAN or Bioconductor it
> > can be even automatically updated (via dh-update-R ... its actually not
> > really done but I think this could be implemented easily - dh-make-R at
> > least generates the fields at the time of initial package creation).
> >
> > I wonder whether we should be maintaining duplicated information and thus
> > would like to hear your opinion about orphaning these fields in
> > debian/upstream/metadata.
> >
> 
> Pretty off topic, but more generally, should we transfer some
> information (i.e. registry references, publications, etc) from the
> d/upstream/metadata to the AppStream cross-distro file [2]?
> Charles suggested last year to push registry references into the
> AppStream file [3]. I opened a bug to add a field for this in the
> AppStream spec [4] two weeks ago but no response yet (CCing Matthias
> as AppStream upstream to get his opinion).

I do not consider this off topic.  In general I'm absolutely in favour of
also providing the data we gathered to other distributions.  I
personally have no idea about AppStream, but if there is any clear path
from upstream/metadata to AppStream I'd follow it.  All applications I
know of are based on the UDD data, and switching the UDD gatherer from
upstream/metadata to AppStream should be no big deal.

Kind regards

Andreas.

[1] https://bugs.debian.org/731340
[2] https://appstream.debian.org/
[3] https://lists.debian.org/debian-med/2017/08/msg00022.html
[4] https://github.com/ximion/appstream/issues/189

-- 
http://fam-tille.de



Redundant fields in debian/upstream/metadata and possible lintian check

2018-05-23 Thread Andreas Tille
Hi,

at least in scientific packages the file debian/upstream/metadata is
frequently used since it is the established way to specify citations
belonging to some software.  The definition of the fields is given in
the Wiki[1].

These data are gathered in UDD[2].  When I inspected the log of the UDD
importer I noticed that there are a lot of redundant fields like
"Homepage" or "Watch", where we agreed that these fields should not be
duplicated in upstream/metadata.  There are also typos and freely
invented fields which are not specified on the Wiki[1] (like 'Distributor',
'CRAN', 'Wiki').  I think it makes sense to have some lintian check for
these undefined fields.  I think I'll file a wishlist bug about this
soon.
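
Such a check would not need to be complicated; a minimal sketch (the accepted
field list below is only illustrative, not the wiki's authoritative one) could
look like this:

#!/usr/bin/env python3
# Hypothetical sketch: warn about unknown fields in debian/upstream/metadata.
import sys
import yaml

ACCEPTED = {
    "Archive", "Bug-Database", "Bug-Submit", "Changelog", "Documentation",
    "Reference", "Registry", "Repository", "Repository-Browse", "Screenshots",
}

def check(path):
    with open(path) as f:
        data = yaml.safe_load(f) or {}
    for field in data:
        if field not in ACCEPTED:
            print("%s: unknown field %r" % (path, field))

if __name__ == "__main__":
    for p in sys.argv[1:] or ["debian/upstream/metadata"]:
        check(p)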

However, before I do I'd like to discuss the fields Name and Contact.
DEP8 defines[3] the fields Upstream-Name and Upstream-Contact which are
the same values in a file that has a high probability to be properly
maintained.  In the case of r-* packages from CRAN or Bioconductor it
can even be updated automatically (via dh-update-R ... it's actually not
really done yet, but I think this could be implemented easily - dh-make-R at
least generates the fields at the time of initial package creation).

I wonder whether we should be maintaining duplicated information and thus
would like to hear your opinion about orphaning these fields in
debian/upstream/metadata.

Kind regards

   Andreas.

[1] https://wiki.debian.org/UpstreamMetadata#Fields
[2] https://wiki.debian.org/UltimateDebianDatabase/
[3] https://www.debian.org/doc/packaging-manuals/copyright-format/1.0/

-- 
http://fam-tille.de



Re: What to do with the "Upstream info" links in the PTS and the packages-metadata branch in collab-qa ?

2018-05-23 Thread Andreas Tille
On Wed, May 16, 2018 at 09:56:11AM +0800, Paul Wise wrote:
> > I cannot parse this.  The UDD gatherer is not using
> > upstream-metadata.debian.net.  It is rather reading the Git repositories of
> > selected Blends teams (formerly on Alioth, now rewritten for Salsa[1]).
> > The UDD gatherer consumes more than only debian/upstream/metadata data,
> > but also data about prospective packages.  It does not depend on
> > umegaya nor on upstream-metadata.debian.net.
> 
> I was misled by udd/upstream_reader.py from UDD saying it uses
> upstream-metadata.debian.net, probably that comment needs to be
> removed.

I've updated the comment in Git[1].  Hope this avoids wrong assumptions
in the future, and sorry for being so sloppy with the docs. :-(
 
> > Maybe vcswatch and fetch-machine-readable_salsa[1] should be merged.
> > I'm pretty sure that there is a lot of room for enhancement, at least for
> > the latter.
> 
> Possibly, but you mentioned the latter also scans packages not yet in
> Debian so the vcswatch maintainer would have to be convinced that
> feature is a good idea.

Maybe we should at least start talking about it.  DebConf 18 could be
a good opportunity for this.
 
> > For the Blends purpose Contents-source files are not sufficient since
> > the VCS of non-released packages is also read and we always consume the latest
> > changes from Git and not from possibly way outdated uploaded packages.
> 
> Not every package has a VCS (and some never will) and some of those
> have upstream metadata files, so Contents-source scanning is needed
> anyway.

I'm very picky about having those Blends-related packages in VCS, so that
at least those upstream files the concept was invented for
(bibliographic references) are in VCS.  But you are definitely correct that
there are packages with upstream files outside VCS (and if you ask me
it's a shame ;-) ).

Kind regards

 Andreas.

[1] 
https://salsa.debian.org/qa/udd/commit/a6a9c3bebbba4ce9f08307aeea00b433466f5070 

-- 
http://fam-tille.de



Re: What to do with the "Upstream info" links in the PTS and the packages-metadata branch in collab-qa ?

2018-05-15 Thread Andreas Tille
Hi Paul,

On Thu, May 10, 2018 at 01:46:46PM +0800, Paul Wise wrote:
> 
> > The Debian Med team includes upstream information from these files
> > in its "tasks" pages (see 
> > for example), using the UDD as a gatherer.
> 
> This is using upstream-metadata.debian.net, seems like it will need
> updating if a replacement shows up.

I cannot parse this.  The UDD gatherer is not using
upstream-metadata.debian.net.  It is rather reading the Git repositories of
selected Blends teams (formerly on Alioth, now rewritten for Salsa[1]).
The UDD gatherer consumes more than only debian/upstream/metadata data,
but also data about prospective packages.  It does not depend on
umegaya nor on upstream-metadata.debian.net.
 
> Update vcswatch to scan each repo for upstream metadata files and
> export those.

Maybe vcswatch and fetch-machine-readable_salsa[1] should be merged.
I'm pretty sure that there is a lot of room for enhancement, at least for
the latter.

> Update the UDD gatherer to use the Contents-source files
> from the local mirror and extract upstream metadata files. Update the
> UDD gatherer to import the vcswatch results too.

For the Blends purpose Contents-source files are not sufficient since
the VCS of non-released packages is also read and we always consume the latest
changes from Git and not from possibly way outdated uploaded packages.

Kind regards

  Andreas.

[1] 
https://salsa.debian.org/blends-team/website/blob/master/misc/machine_readable/fetch-machine-readable_salsa.py

-- 
http://fam-tille.de



Re: What to do with the "Upstream info" links in the PTS and the packages-metadata branch in collab-qa ?

2018-05-15 Thread Andreas Tille
On Tue, May 08, 2018 at 06:02:36PM +1000, Stuart Prescott wrote:
> Are there any other consumers of these files?

Definitely.
 
> If there are no longer any consumers, should we really be nagging 
> maintainers to include them?
> 
> $ lintian-info --list-tags | grep upstream-metadata
> upstream-metadata-file-is-missing
> upstream-metadata-is-not-a-file
> upstream-metadata-yaml-invalid
> 
> While the last two still make sense (if the file is there, make sure it is 
> valid), does the first one?

I agree that the first one is not needed.

> I've never been a fan of lintian suggesting that 
> maintainers add a file that for most(?) packages, is not going to contain 
> anything that is not duplicated information from elsewhere in the package.
> 
> (If there are other users of this information, great...)

The file debian/upstream/metadata is described in the wiki[1].  The
definition was cleaned up so that it no longer contains redundant
information.  Several of its fields (Reference, Registry and others)
are read and injected into UDD.

Kind regards

  Andreas.

[1] https://wiki.debian.org/UpstreamMetadata

-- 
http://fam-tille.de



Re: on diverging memberships between alioth/salsa and unix group

2018-03-14 Thread Andreas Tille
On Wed, Mar 14, 2018 at 03:54:27PM +0100, Lucas Nussbaum wrote:
> 
> As a data point, I also added Andreas to the qa group (as Developer) after
> he asked in private mail and I didn't check debian-qa@ ;)

I admit I read the QA list later than your mail and just thought it might be
sensible input to mention here the feature I assumed was missing in the Salsa
group.
 
> I have no opinion on whether we should add people to projects or to the
> qa group.

The fact is that I do not read QA very regularly and that in most cases I
commit to UDD anyway.  So it would reflect reality if you gave me
permission only for this specific project.  However, I do not think I'll
misuse my general permissions. ;-)

Kind regards

 Andreas.

-- 
http://fam-tille.de



Re: on diverging memberships between alioth/salsa and unix group

2018-03-14 Thread Andreas Tille
On Tue, Mar 06, 2018 at 11:28:41AM +0100, Mattia Rizzolo wrote:
> 
> The salsa 'qa' group is very much not for maintaining package, much less
> if you are actually planning on adopting it (you haven't ITAed it yet).
> Please put it in the 'debian' group (i.e., the former collab-maint), or
> elsewhere.

Is there any reason why the "Request membership" button is missing for
the qa team on Salsa?

Kind regards

  Andreas (who intends to commit some patches for UDD)

-- 
http://fam-tille.de



Accepted id3v2 0.1.12+dfsg-1 (source) into unstable

2018-02-01 Thread Andreas Tille
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA256

Format: 1.8
Date: Thu, 01 Feb 2018 13:24:36 +0100
Source: id3v2
Binary: id3v2
Architecture: source
Version: 0.1.12+dfsg-1
Distribution: unstable
Urgency: medium
Maintainer: Debian QA Group <packa...@qa.debian.org>
Changed-By: Andreas Tille <ti...@debian.org>
Description:
 id3v2  - command line id3v2 tag editor
Changes:
 id3v2 (0.1.12+dfsg-1) unstable; urgency=medium
 .
   * QA upload.
   * Remove binary files and .git dir to enable importing import via gbp
   * cme fix dpkg-control
   * Moved packaging from SVN to Git
   * debhelper 11
Checksums-Sha1:
 69c7a16069318e42d9b219a6567b351c773a6327 1897 id3v2_0.1.12+dfsg-1.dsc
 906cbd25097280bc4f9748314a46bb742487ea57 23080 id3v2_0.1.12+dfsg.orig.tar.xz
 1495ec0fc48e3d7454d54bf799bb29586dcef414 9788 id3v2_0.1.12+dfsg-1.debian.tar.xz
Checksums-Sha256:
 a91e197d4fcd34e5a443f148a553e338b7640f566ba7ee36f2532ad51223720f 1897 
id3v2_0.1.12+dfsg-1.dsc
 2b4c2bf2ceda9a829bec7ece9ebaade493ad0861256e90ae84b6380be66a1a3b 23080 
id3v2_0.1.12+dfsg.orig.tar.xz
 fc8d8927ce4876d47d7f19b0ad86833ead83e73d63d9e2ac6292e6cc2d9e3274 9788 
id3v2_0.1.12+dfsg-1.debian.tar.xz
Files:
 3adb5bee9b18ec05ea236e7b80d379c5 1897 sound optional id3v2_0.1.12+dfsg-1.dsc
 71f23ff0a2d129c637d6f46578899ed9 23080 sound optional 
id3v2_0.1.12+dfsg.orig.tar.xz
 5f391ba179a4d905db9fa30cd25189ff 9788 sound optional 
id3v2_0.1.12+dfsg-1.debian.tar.xz

-BEGIN PGP SIGNATURE-

iQJFBAEBCAAvFiEE8fAHMgoDVUHwpmPKV4oElNHGRtEFAlpzCYgRHHRpbGxlQGRl
Ymlhbi5vcmcACgkQV4oElNHGRtGiNA//UZ4v42c/bscUjr8qbIak8DsFdnNNDG4m
cRSCma7LP9O4F4PQxiXtZQOx2aa9L248I7R4yLbNxUyzAtCyZmXBPKQhLaYvRSss
XwkuOs3bf9LjTZz/xgTErbDtlB1xxUJ+9kouThXfH8q7kBmWVuSoux62+p4YCLWK
7JDSv4JTkRJsiNvCA3eXzRFCLPm355OfntnY+aWVxUjNmqyMM2kZARA/Z5eQNsXL
NHpJ/Goe3waOTVe5zjf0z1pH3Ei4LHtnEhVOru8Tb+TiyeYQjeGJhxxMeC2QH2SH
6Tlth60+Qcv3pnmyHA8/ODB4UWoae2mTWI63BGOUlYKTUWkG/CYTgO2/vWMFMIgW
4Evh4cUy5nT6lSf+1ggdWVeibryrt1VBtqpr9xhSu3SrHgQ7cIjPLQaKm9mwdXzE
ooFu0ItJw0tbfHHPWxfDuUlnrvVe8tPdUbxiHfoycX13vTCUcCffMzmGU7BoKc96
czY7gQ0zDZm1gE16+LGLtONMNBML5kUUMCH17N5bT4vUwZ9xX++t+sX1ZGCrS7EJ
Q7I8FTUQk4OHzxbqKaq5QWdxZs/2F25nOxQsm5QJlr8AvUjA2p+AMqztSQW6J+3y
odfVK6bCf1GPRbCd3YwbyIvC139ZUuyuD9C9q+KWoEpx6XHJY+BG5fORf0HNUagK
/xznrgDKFmM=
=4CyL
-END PGP SIGNATURE-



Accepted libavg 1.8.2-1 (source) into unstable

2018-01-31 Thread Andreas Tille
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA256

Format: 1.8
Date: Wed, 31 Jan 2018 10:40:52 +0100
Source: libavg
Binary: python-libavg
Architecture: source
Version: 1.8.2-1
Distribution: unstable
Urgency: medium
Maintainer: Debian Science Maintainers 
<debian-science-maintain...@lists.alioth.debian.org>
Changed-By: Andreas Tille <ti...@debian.org>
Description:
 python-libavg - High-level development platform for media-centric applications
Changes:
 libavg (1.8.2-1) unstable; urgency=medium
 .
   * Team upload.
   * New upstream version
   * Moved packaging from SVN to Git
   * Moved packaging to Debian Science team (no real Uploader yet)
   * d/watch pointing to Github
   * cme fix dpkg-control
   * debhelper 11
   * hardening=+all
Checksums-Sha1:
 e7c5a5a15293ee05db2e81504cb71404684a1e02 2300 libavg_1.8.2-1.dsc
 03e4361ddcfd27fcecfdb8d116d3172164852fa1 11094621 libavg_1.8.2.orig.tar.gz
 980faa7babc250820f48b8a0b4a4f92413da620c 13432 libavg_1.8.2-1.debian.tar.xz
Checksums-Sha256:
 ac447741abcac04e8b66cdde3eb83930540abd6bcd2a9f8f0dbf6f200af2608f 2300 
libavg_1.8.2-1.dsc
 ad43757ea209b17ba2c79d23a34ddecc8f70bc18a6b314ccc52b9be50443a493 11094621 
libavg_1.8.2.orig.tar.gz
 32b8bad1c6eecedd3a01aff69007d34eb799d9e733b56a212afce9fc71f0781d 13432 
libavg_1.8.2-1.debian.tar.xz
Files:
 a6a927fd50550eb7c654273228c6a9b4 2300 python optional libavg_1.8.2-1.dsc
 4f9d8511cb2c778a9800de0626c45c99 11094621 python optional 
libavg_1.8.2.orig.tar.gz
 4c9a1acfa5321604664c3b2c92777b6c 13432 python optional 
libavg_1.8.2-1.debian.tar.xz

-BEGIN PGP SIGNATURE-

iQJFBAEBCAAvFiEE8fAHMgoDVUHwpmPKV4oElNHGRtEFAlpxkycRHHRpbGxlQGRl
Ymlhbi5vcmcACgkQV4oElNHGRtGXSxAAiApzPXaMWso4EYAwgTMYREsjM2iyPH1D
4bv4S4lTQ29odcbKN3jNSMcdFnd+TTPAfMw4RBrAxIo6JpUXAzbRJEVjrFf6z8SO
sAyUyFpxEDlPoTJLLdLfh2hvhR3zZVjDh7ghHMYUaPN3uLSTf55zunoR+52ggDHL
rnM2JZoVfwSziepADR6cI42ZGVrqn/CG1NMbBAJC9vgCTSeInCqH2IshlmERVhUi
APfAlBfTadfs/3HGi8rUOZA+f6TMg9FtcsXYC77pgEm7O3Pj7wFIf0UOq15No6UU
KZcxm/6tBbLTjuCrZxpAE4wMzL75/rScFUE+2hTRvV5ZrIky+4XxnD8S5i/9M8bd
94QHBXmH/4b6u9Zbsjzk6zECYJ27DQMTAb15vNShEn5ru/w5ngSYCcrCcN1r9z2O
LoSeFgLXznY+WEhZHJ5wKCJ2g07b/UEey6FXgpHU9BoBID+Sv2WTEvAKCU+1hPA7
gTSA/mcC+mknuOHFZfbt/AD8gJ1rjUvgC4yl2eWF/zm02WDQsO196HPl5THIRpS5
vCYoTeMifCPi2mv27VEaV1W2COgnwW0LO2cMAYIo0irhaXxR9+kfiwfy8yUlpuUi
vsGIsQM26R5ItvOy3PwbruIlJi0TVi65HHQc6ZSKzzHqX1v3BzDiBHrRcLyAAj8o
6Frn2aHpxyo=
=8jPe
-END PGP SIGNATURE-


