Re: Re-approaching package tagging

2018-12-18 Thread swedebugia

On 2018-12-19 07:51, swedebugia wrote:

On 2018-12-18 08:48, Catonano wrote:



On Mon, 17 Dec 2018 at 22:10, swedebugia <swedebu...@riseup.net> wrote:


    Hi :)

    On 2018-12-17 20:01, Christopher Lemmer Webber wrote:
 > Hello,
 >
 > In the past when we've discussed package tagging, I think Ludo' has been
 > against it, primarily because it's a giant source of bikeshedding.  I
 > agree that it's a huge space for bikeshedding... no space provides more
 > bikeshedding than naming things, and tagging things is a many to many
 > naming system.
 >
 > However, I will say that finding packages based on topical interest is
 > pretty hard right now.  If I want to find all the available roguelikes:
 >
 > cwebber@jasmine:~$ guix package -A rogue
 > hyperrogue            10.5    out     gnu/packages/games.scm:3652:2
 > roguebox-adventures   2.2.1   out     gnu/packages/games.scm:1047:2
 >
 > Hm, that's strange, there's definitely more roguelikes that should show
 > up than that!  A more specific search is even worse:
 >
 > cwebber@jasmine:~$ guix package -A roguelike
 > cwebber@jasmine:~$
 >
 > What I should have gotten back:
 >   - angband
 >   - cataclysm-dda
 >   - crawl
 >   - crawl-tiles
 >   - hyperrogue
 >   - nethack
 >   - roguebox-adventures
 >   - tome4
 >
 > So I only got 1/4 of the entries I was interested in in my first query.
 > Too bad!
 >
 > I get that we're opening up space for bikeshedding and *that's true*.
 > But it seems like not doing so makes things hard on users.
 >
 > What do you think?  Is there a way to open the (pandora's?) box of tags
 > safely?

    Yes and no.

    Pjotr and I have discussed this relating to biotech software. He said
    that many scientists have a hard time finding the right tools for
    the job.

    I proposed tight integration with wikidata[1] (every software in the
    world will eventually have an item there) and Guix (QID on every
    package and lookup/category integration) and leave all the
    categorizing to them. Ha, problem sidestepped, they are bikeshedding
    experts over there in wikiland! :D

    The advantage of this is that everyone using wikidata (every package
    manager) could pull the same categorization so we only do it once in a
    central

    What do you think?

    --


There is also the Free Software Directory
https://directory.fsf.org/wiki/Main_Page

I don't know what the relationship between Wikidata and the FSD is

Does Wikidata import data from the FSD? Or vice versa?



I don't know. For now at least they keep a reference to the FSD on
software entries that exist in the FSD.


We could also integrate the FSD, but I have yet to investigate whether
they provide an API for their entries.


Anyway, I view the FSD as a subset of Wikidata/Wikipedia: Wikidata is
the node and the FSD the leaf. Wikidata/Wikipedia will probably within
a few years contain the data, or links to the data, that now exists in
the FSD.


Correct me if I'm wrong, but the only advantage of the FSD over
Wikidata & Wikipedia is that it does not include references to
proprietary software at all.


In my view it is more feasible to compile the information in a
structured way in a central node and then pull the relevant bits to
the leaves.


E.g. the FSD of the future could be generated from all Wikidata
entries, and extracts of Wikipedia, that are an instance of
https://www.wikidata.org/wiki/Q341. This would avoid fragmentation and
help concentrate effort on building a large, shared, collective source
of all knowledge within the wiki community. The FSD could still exist
and would surely help enrich the upstream data.


Similarly we could generate a Wikipedia subset without any entries
pointing to (evil) private corporations (any entry that is part of
https://www.wikidata.org/wiki/Q5621421 or whatever). I can't imagine
what this would be good for, but it is possible.


I cannot imagine that the information in the FSD would not be accepted
in any of the Wikimedia projects. I could be wrong, though, as I
honestly have not visited or studied the FSD very much.




Also the license of the FSD (GFDL 1.2) differs from both Wikidata (CC0) 
and Wikipedia (CC-BY-SA 4.0 + GFDL 1.2).


This is not to their advantage in the long run.

I fear the FSD is already becoming unmaintained and obsolete, with people
favoring more open and smarter solutions from the Wikimedia projects (at
least I am).


When it comes to completeness we have at least 500,000 packages missing
in both Wikidata and the FSD (450,000+ MIT- and CC0-licensed npm packages).
Would any of you like to import those twice? I don't, and as I see it
Wikidata is far superior in multiple ways to get the job done and do it
well, with a big community backing it up with tools, bots, manual edits,
et al. Who wants to update with new versions in two

Re: Re-approaching package tagging

2018-12-18 Thread swedebugia

On 2018-12-18 08:48, Catonano wrote:



On Mon, 17 Dec 2018 at 22:10, swedebugia <swedebu...@riseup.net> wrote:


Hi :)

On 2018-12-17 20:01, Christopher Lemmer Webber wrote:
 > Hello,
 >
 > In the past when we've discussed package tagging, I think Ludo' has been
 > against it, primarily because it's a giant source of bikeshedding.  I
 > agree that it's a huge space for bikeshedding... no space provides more
 > bikeshedding than naming things, and tagging things is a many to many
 > naming system.
 >
 > However, I will say that finding packages based on topical interest is
 > pretty hard right now.  If I want to find all the available roguelikes:
 >
 > cwebber@jasmine:~$ guix package -A rogue
 > hyperrogue            10.5    out     gnu/packages/games.scm:3652:2
 > roguebox-adventures   2.2.1   out     gnu/packages/games.scm:1047:2
 >
 > Hm, that's strange, there's definitely more roguelikes that should show
 > up than that!  A more specific search is even worse:
 >
 > cwebber@jasmine:~$ guix package -A roguelike
 > cwebber@jasmine:~$
 >
 > What I should have gotten back:
 >   - angband
 >   - cataclysm-dda
 >   - crawl
 >   - crawl-tiles
 >   - hyperrogue
 >   - nethack
 >   - roguebox-adventures
 >   - tome4
 >
 > So I only got 1/4 of the entries I was interested in in my first query.
 > Too bad!
 >
 > I get that we're opening up space for bikeshedding and *that's true*.
 > But it seems like not doing so makes things hard on users.
 >
 > What do you think?  Is there a way to open the (pandora's?) box of tags
 > safely?

Yes and no.

Pjotr and I have discussed this relating to biotech software. He said
that many scientists have a hard time finding the right tools for
the job.

I proposed tight integration with wikidata[1] (every software in the
world will eventually have an item there) and Guix (QID on every
package and lookup/category integration) and leave all the
categorizing to them. Ha, problem sidestepped, they are bikeshedding
experts over there in wikiland! :D

The advantage of this is that everyone using wikidata (every package
manager) could pull the same categorization so we only do it once in a
central

What do you think?

-- 




There is also the Free Software Directory
https://directory.fsf.org/wiki/Main_Page

I don't know what the relationship between Wikidata and the FSD is

Does Wikidata import data from the FSD? Or vice versa?



I don't know. For now at least they keep a reference to the FSD on
software entries that exist in the FSD.


We could also integrate the FSD, but I have yet to investigate whether
they provide an API for their entries.


Anyway, I view the FSD as a subset of Wikidata/Wikipedia: Wikidata is
the node and the FSD the leaf. Wikidata/Wikipedia will probably within
a few years contain the data, or links to the data, that now exists in
the FSD.


Correct me if I'm wrong, but the only advantage of the FSD over
Wikidata & Wikipedia is that it does not include references to
proprietary software at all.


In my view it is more feasible to compile the information in a
structured way in a central node and then pull the relevant bits to
the leaves.


E.g. the FSD of the future could be generated from all Wikidata
entries, and extracts of Wikipedia, that are an instance of
https://www.wikidata.org/wiki/Q341. This would avoid fragmentation and
help concentrate effort on building a large, shared, collective source
of all knowledge within the wiki community. The FSD could still exist
and would surely help enrich the upstream data.
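
As a purely illustrative sketch (nothing like this exists in Guix today;
the query.wikidata.org endpoint and the P31/Q341 identifiers are real,
everything else here is my assumption), such a generator could start from
a SPARQL query for every item that is an instance of "free software",
wrapped in Guile with the guile-json module:

(use-modules (web client)
             (web uri)
             (json)               ; guile-json, assumed to be available
             (rnrs bytevectors)
             (ice-9 receive))

;; SPARQL: every item that is an instance of (P31) free software (Q341).
(define %free-software-query
  "SELECT ?item ?itemLabel WHERE {
     ?item wdt:P31 wd:Q341 .
     SERVICE wikibase:label { bd:serviceParam wikibase:language \"en\" . }
   } LIMIT 10")

(define (wikidata-free-software)
  "Return the decoded JSON result of %free-software-query."
  (receive (response body)
      (http-get (string->uri
                 (string-append
                  "https://query.wikidata.org/sparql?format=json&query="
                  (uri-encode %free-software-query))))
    ;; NB: how guile-json represents JSON objects differs between versions.
    (json-string->scm
     (if (bytevector? body) (utf8->string body) body))))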


Similarly we could generate a Wikipedia subset without any entries
pointing to (evil) private corporations (any entry that is part of
https://www.wikidata.org/wiki/Q5621421 or whatever). I can't imagine
what this would be good for, but it is possible.


I cannot imagine that the information in the FSD would not be accepted
in any of the Wikimedia projects. I could be wrong, though, as I
honestly have not visited or studied the FSD very much.


--
Cheers Swedebugia



Re: Going through the bugs...

2018-12-18 Thread swedebugia

On 2018-12-18 15:37, Joshua Branson wrote:

swedebu...@riseup.net writes:


Hi

Here are more visual statistics:
https://debbugs.gnu.org/rrd/guix.html


How did you generate the graphs?



I did not. Debbugs does that automatically for all "packages".
Replace "guix" in the URL with e.g. "guix-patches".

--
Cheers Swedebugia



Internet Archive APIs useful as fallback?

2018-12-18 Thread swedebugia

Hi

I stumbled over these on Clinton's blog and thought I would share them 
here if anybody is interested.


APIs for content other than the Wayback Machine:
https://blog.archive.org/2018/12/13/documentation-for-public-apis-at-the-internet-archive/

APIs for the Wayback Machine:
https://archive.org/help/wayback_api.php

Excerpt:
Wayback Availability JSON API

This simple API for Wayback is a test to see if a given URL is archived 
and currently accessible in the Wayback Machine. This API is useful for 
providing a 404 or other error handler which checks Wayback to see if it 
has an archived copy ready to display. The API can be used as follows:

http://archive.org/wayback/available?url=example.com

which might return:

{
    "archived_snapshots": {
        "closest": {
            "available": true,
            "url": "http://web.archive.org/web/20130919044612/http://example.com/",
            "timestamp": "20130919044612",
            "status": "200"
        }
    }
}
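
As a sketch of how a fallback check could look from Guile (this is only
an illustration, not existing Guix code; it assumes the guile-json module
and a version of it that decodes JSON objects to association lists):

(use-modules (web client)
             (web uri)
             (json)               ; guile-json, assumed to be available
             (rnrs bytevectors)
             (ice-9 receive))

(define (wayback-snapshot url)
  "Return the URL of the closest Wayback snapshot of URL, or #f if the
availability API does not advertise one."
  (receive (response body)
      (http-get (string->uri
                 (string-append "http://archive.org/wayback/available?url="
                                url)))
    (let* ((result  (json-string->scm
                     (if (bytevector? body) (utf8->string body) body)))
           (snaps   (assoc-ref result "archived_snapshots"))
           (closest (and snaps (assoc-ref snaps "closest"))))
      (and closest (assoc-ref closest "url")))))

;; Example:
;;   (wayback-snapshot "example.com")
;;   => "http://web.archive.org/web/20130919044612/http://example.com/"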


--
Cheers Swedebugia



02/02: import: Update opam importer.

2018-12-18 Thread Eric Bavier
> commit cce654fabdf09cac7d18f9bad842ba8445aa022c
> Author: Julien Lepiller 
> Date:   Mon Dec 17 21:05:35 2018 +0100
> 
> import: Update opam importer.
> 
> * guix/import/opam.scm: Update importer for opam 2.
> * tests/opam.scm: Update tests for the opam 2 importer.
> ---
>  guix/import/opam.scm | 305 
> ---
>  po/guix/POTFILES.in  |   1 +
>  tests/opam.scm   | 225 +
>  3 files changed, 321 insertions(+), 210 deletions(-)
> 
> diff --git a/guix/import/opam.scm b/guix/import/opam.scm
> index f252bdc..c42a5d7 100644
> --- a/guix/import/opam.scm
> +++ b/guix/import/opam.scm
> @@ -17,132 +17,108 @@
>  ;;; along with GNU Guix.  If not, see .
>  
>  (define-module (guix import opam)
> +  #:use-module (ice-9 ftw)
>#:use-module (ice-9 match)
> -  #:use-module (ice-9 vlist)
> +  #:use-module (ice-9 peg)

This commit breaks Guix compatibility with Guile 2.0:

  $ guix build -K --with-commit=guile2.0-guix=cce654fabf guile2.0-guix
<...snip...>
  LOAD guix/import/opam.scm
Backtrace:
In ice-9/r4rs.scm:
  90: 19 [dynamic-wind # ...]
In ice-9/eval.scm:
 432: 18 [eval # #]
 432: 17 [eval # #]
 481: 16 [lp (#) (#)]
In ice-9/boot-9.scm:
2900: 15 [resolve-interface (guix import opam) #:select ...]
2825: 14 [# # ...]
In ice-9/r4rs.scm:
  90: 13 [dynamic-wind # ...]
In ice-9/boot-9.scm:
3101: 12 [try-module-autoload (guix import opam) #f]
2412: 11 [save-module-excursion #]
3121: 10 [#]
In unknown file:
   ?: 9 [primitive-load-path "guix/import/opam" ...]
In ice-9/eval.scm:
 505: 8 [# (define-module # # 
...)]
In ice-9/psyntax.scm:
1107: 7 [expand-top-sequence ((define-module # # # ...)) () ((top)) ...]
 990: 6 [scan ((define-module (guix import opam) #:use-module ...)) () ...]
 279: 5 [scan ((#(syntax-object let # ...) (#) (# #) ...)) () ...]
In ice-9/eval.scm:
 411: 4 [eval # ()]
In ice-9/boot-9.scm:
2987: 3 [define-module* (guix import opam) #:filename ...]
2962: 2 [resolve-imports (((ice-9 ftw)) ((ice-9 match)) ((ice-9 peg)) ...)]
2903: 1 [resolve-interface (ice-9 peg) #:select ...]
In unknown file:
   ?: 0 [scm-error misc-error #f "~A ~S" ("no code for module" (ice-9 peg)) #f]

ERROR: In procedure scm-error:
ERROR: no code for module (ice-9 peg)
make[2]: *** [Makefile:5572: make-go] Error 1
make[2]: Leaving directory 
'/tmp/guix-build-guile2.0-guix-git.cce654f.drv-0/source'
make[1]: *** [Makefile:4653: all-recursive] Error 1
make[1]: Leaving directory 
'/tmp/guix-build-guile2.0-guix-git.cce654f.drv-0/source'
make: *** [Makefile:3269: all] Error 2


I'm hesitant to suggest that we just update our requirements to Guile
2.2 (though I know Ludovic wants to do that eventually), since I think
that would just make things that much harder for anyone who wants to
build Guix from source on a foreign distro.  Debian stable does not
have Guile 2.2 (it's at Guile 2.0.13).

OTOH, I don't know how to keep this code working on Guile 2.0.13.

`~Eric




Packaging Terraform, a Golang package

2018-12-18 Thread Chris Marusich
Hi,

I'm trying to package Terraform, which is available under the Mozilla
Public License 2.0:

https://github.com/hashicorp/terraform

I'm not very familiar with how Golang programs are packaged, so I might
be missing some obvious things.  Please let me know if I am!

To begin, I thought I would first try to build it without creating a
package definition just yet.  Should be easy, right?

Clone the repo:

git clone https://github.com/hashicorp/terraform.git

Create a profile for hacking around that contains some tools that the
project seems to need:

guix package -i go make git bash grep findutils which -p .guix-profile

Make sure we're only using those dependencies:

eval "$(guix package --search-paths=exact --profile=$(realpath --no-symlinks 
.guix-profile))"

Enter the project directory:

cd terraform

Following the docs, run "make tools" to "install some required tools".
This downloads some stuff using "go get" (I'll probably have to put
those dependencies into the package definition later...):

--8<---cut here---start->8---
$ make tools
GO111MODULE=off go get -u golang.org/x/tools/cmd/stringer
GO111MODULE=off go get -u golang.org/x/tools/cmd/cover
GO111MODULE=off go get -u github.com/golang/mock/mockgen
--8<---cut here---end--->8---

Nice! Now, per the docs, we run make:

--8<---cut here---start->8---
$ make
==> Checking that code complies with gofmt requirements...
which: no stringer in (/home/marusich/terraform/.guix-profile/bin)
# We turn off modules for "go generate" because our downstream generate
# commands are not all ready to deal with Go modules yet, and this
# avoids downloading all of the deps that are in the vendor dir anyway.
GO111MODULE=off go generate ./...
2018/12/18 19:06:13 Generated command/internal_plugin_list.go
addrs/resource.go:256: running "stringer": exec: "stringer": executable file 
not found in $PATH
backend/operation_type.go:3: running "stringer": exec: "stringer": executable 
file not found in $PATH
backend/local/hook_count_action.go:3: running "stringer": exec: "stringer": 
executable file not found in $PATH
config/resource_mode.go:3: running "stringer": exec: "stringer": executable 
file not found in $PATH
configs/provisioner.go:121: running "stringer": exec: "stringer": executable 
file not found in $PATH
configs/configschema/schema.go:86: running "stringer": exec: "stringer": 
executable file not found in $PATH
helper/schema/resource_data_get_source.go:3: running "stringer": exec: 
"stringer": executable file not found in $PATH
plans/action.go:15: running "stringer": exec: "stringer": executable file not 
found in $PATH
plugin/mock_proto/generate.go:1: running "mockgen": exec: "mockgen": executable 
file not found in $PATH
states/instance_object.go:43: running "stringer": exec: "stringer": executable 
file not found in $PATH
states/statemgr/migrate.go:132: running "stringer": exec: "stringer": 
executable file not found in $PATH
terraform/context_graph_type.go:3: running "stringer": exec: "stringer": 
executable file not found in $PATH
tfdiags/diagnostic.go:20: running "stringer": exec: "stringer": executable file 
not found in $PATH
make: *** [Makefile:80: generate] Error 1
[2] marusich@garuda.local:~/terraform/terraform
$ 
--8<---cut here---end--->8---

It seems like it failed to build because some tools were missing from
PATH, specifically "stringer" and "mockgen".  I've tried various methods
of installing these things, but I can't seem to get it working.  How do
I get "stringer" and "mockgen"?

I ran these commands, but it didn't seem to work:

--8<---cut here---start->8---
[2] marusich@garuda.local:~/terraform/terraform
$ go get github.com/golang/mock/gomock
go: finding github.com/hashicorp/hcl2 v0.0.0-20181215005721-253da47fd604
go: finding github.com/terraform-providers/terraform-provider-aws v1.52.0
go: finding github.com/aws/aws-sdk-go v1.16.4
go: finding github.com/zclconf/go-cty v0.0.0-20181218225846-4fe1e489ee06
go: finding github.com/boombuler/barcode v1.0.0
go: finding github.com/pquerna/otp v1.0.0
go: finding github.com/jmespath/go-jmespath v0.0.0-20180206201540-c2b33e8439af
go: finding github.com/golang/mock/gomock latest
go: downloading github.com/golang/mock v1.2.0
[0] marusich@garuda.local:~/terraform/terraform
$ go install github.com/golang/mock/mockgen
[0] marusich@garuda.local:~/terraform/terraform
$ go get -u -a golang.org/x/tools/cmd/stringer
go: finding golang.org/x/tools/cmd/stringer latest
go: finding golang.org/x/tools/cmd latest
go: finding golang.org/x/tools latest
go: downloading golang.org/x/tools v0.0.0-20181218204010-d4971274fe38
[0] marusich@garuda.local:~/terraform/terraform
$ go install golang.org/x/tools/cmd/stringer
[0] marusich@garuda.local:~/terraform/terraform
$ type -P mockgen
[1] marusich@garuda.local:~/terraform/terraform
$ type -P stringer
[1] 

Re: bioinformatics.scm vs bioconductor.scm ?

2018-12-18 Thread Ricardo Wurmus


zimoun  writes:

> Ok, but for example this convention about CRAN is not consistent with
> the importer. :-)
>   guix import cran corpcor -r
> fills the license field with (license gpl3+) and not (license license:gpl3+)

That’s right.  The importer does not know where the generated package
definition is supposed to be used.

> In other words, why the cran.scm needs a prefix for the license field?

It uses a prefix because we use the “zlib” package often, but not the
“zlib” license.  We could exclude the “zlib” license from (guix
licenses), or import only a specified list of licenses, or we can solve
this naming conflict by prefixing all values from (guix licenses) with
“license:” (or anything else, really).
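
To make the conflict concrete, here is a minimal sketch (the module name
is invented for illustration; (guix licenses) and (gnu packages
compression) really do both export a binding called “zlib”):

(define-module (gnu packages example)              ; hypothetical module
  #:use-module ((guix licenses) #:prefix license:) ; license:zlib, license:gpl3+, ...
  #:use-module (gnu packages compression))         ; exports the zlib *package*

;; Within this module:
;;   zlib          -> the zlib package from (gnu packages compression)
;;   license:zlib  -> the zlib license from (guix licenses)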

Really small modules often don’t have this problem in the first place,
so they don’t need to find a solution to work around the naming
conflicts.

--
Ricardo




Re: bioinformatics.scm vs bioconductor.scm ?

2018-12-18 Thread Ricardo Wurmus
Hi,

> Is the bioconductor importer usable from `guix import` ?

yes.  You may encounter minor problems when using the recursive
bioconductor importer, as it may try to look up CRAN packages on
Bioconductor.

> This package is on Bioconductor:
> https://bioconductor.org/packages/release/bioc/html/flowCore.html

I’d do

./pre-inst-env guix import cran -a bioconductor -r flowCore

This fails because it wants corpcor from CRAN.  So we do:

./pre-inst-env guix import cran -r corpcor

We dump the result (with minor changes) in (gnu packages cran) and try
again to import flowCore.  This time it succeeds.

> Hum, the package BiocGenerics needs the version >= 0.1.14, and it is
> not defined in the package.

We have r-biocgenerics 0.28.0 in gnu/packages/bioinformatics.scm.
That’s one of the packages that should move eventually.

> Then, the package grDevices, graphics, methods, stats, stats4 are
> required (see bioconductor webpage) but not defined elsewhere. Is it
> good ?

These are all default packages that are part of R itself.  The importer
skips them.

> What is the convention about license ?
> (license name) or (license license:name)

This depends on the target module.  cran.scm, bioinformatics.scm, and
bioconductor.scm all use the “license:” prefix.  web.scm on the other
hand uses the “l:” prefix.  Take a look at the #:use-module clause at
the top of the module.

--
Ricardo




`guix lint' warn of GitHub autogenerated source tarballs

2018-12-18 Thread Arun Isaac


Now that we are avoiding GitHub autogenerated source tarballs since they
are unstable and cause hash mismatch errors, can we have `guix lint'
emit a warning if these autogenerated source tarballs are used?
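
As a rough sketch of the predicate such a checker could use (this is just
an illustration, not an actual `guix lint' checker): GitHub's
autogenerated tarballs live under /archive/, whereas manually uploaded
release assets live under /releases/download/, so a URL test could look
like this:

(use-modules (ice-9 regex))

(define (github-autogenerated-tarball? url)
  "Return #t if URL looks like a GitHub autogenerated source archive."
  (and (string-match
        "^https?://github\\.com/[^/]+/[^/]+/archive/.+\\.(tar\\.gz|zip)$"
        url)
       #t))

;; (github-autogenerated-tarball?
;;  "https://github.com/foo/bar/archive/v1.0.tar.gz")
;; => #t
;; (github-autogenerated-tarball?
;;  "https://github.com/foo/bar/releases/download/v1.0/bar-1.0.tar.gz")
;; => #f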



Re: bioinformatics.scm vs bioconductor.scm ?

2018-12-18 Thread zimoun
Dear,

Thank you for your explanations.
And sorry if I am still slow to understand.

> > What is the convention about license ?
> > (license name) or (license license:name)
>
> Just about this point: This is not a "convention", this is part of the
> language definition of Guile, the underlying Scheme implementation:

I understand this. :-)

>
> In the module gnu/packages/cran.scm (and many others too) you find:
>
> (define-module (gnu packages cran)
>   #:use-module ((guix licenses) #:prefix license:)
>   #:use-module (guix packages)
> [...]
> )
>
> That means: use everything from module "guix licenses" and prefix it
> with "license:". So, in the cran module, you must use "license:name" to
> use the publicly defined "name" from the "guix licenses" module.

Ok, but for example this convention about CRAN is not consistent with
the importer. :-)
  guix import cran corpcor -r
fills the license field with (license gpl3+) and not (license license:gpl3+)

In other words, why the cran.scm needs a prefix for the license field?

> In other packages that import "guix licenses" without the prefix, you
> use "name" directly. See gnu/packages/scsi.scm for an example.

Ok.
But is there a convention, or an explanation for why some packages use a
prefix (e.g. cran.scm) and others do not (e.g. scsi.scm)?

Thank you again for your explanations.

Best regards,
simon



Re: bioinformatics.scm vs bioconductor.scm ?

2018-12-18 Thread Björn Höfling
On Tue, 18 Dec 2018 06:31:44 -0500
zimoun  wrote:

> What is the convention about license ?
> (license name) or (license license:name)

Just about this point: This is not a "convention", this is part of the
language definition of Guile, the underlying Scheme implementation:

In the module gnu/packages/cran.scm (and many others too) you find:

(define-module (gnu packages cran)
  #:use-module ((guix licenses) #:prefix license:)
  #:use-module (guix packages)
[...]
)

That means: use everything from module "guix licenses" and prefix it
with "license:". So, in the cran module, you must use "license:name" to
use the publicly defined "name" from the "guix licenses" module. 

In other packages that import "guix licenses" without the prefix, you
use "name" directly. See gnu/packages/scsi.scm for an example.

Björn





Re: IPFS trouble

2018-12-18 Thread Hector Sanjuan
‐‐‐ Original Message ‐‐‐
On Tuesday, December 18, 2018 2:07 PM, Laura Lazzati 
 wrote:

> On Tue, Dec 18, 2018 at 10:00 AM Björn Höfling
> bjoern.hoefl...@bjoernhoefling.de wrote:
>
> > Hi Laura,
> > I'm sending this also to guix-devel [and sorry for the previous, empty,
> > private mail, I was too fast on the sending button].
> > Note: I'm also new to IPFS, so I hope everything is correct here, if
> > someone knows better, please reply.
> > On Tue, 18 Dec 2018 00:04:56 -0300
> > Laura Lazzati laura.lazzati...@gmail.com wrote:
> >
> > > > A good guide is:
> > > > https://medium.com/textileio/the-definitive-guide-to-publishing-content-on-ipfs-ipns-dfe751f1e8d0.
> >
> > > Sorry, I read the documentation, but I am mixed up.
> > > I have my peer identity, and my /ipfs/hash...
> > > And I find confusing several things:
> > > If I run ipfs add myfile, using my command line, I cannot find myfile
> > > in my node. even the add command returns a hash for that file I guess.
> > > And if I run ipfs ls or cat that hash, the file is shown.
> >
> > I don't get what you mean with "I cannot find my file". Where are you
> > looking for it?
> >
> > > I found useful for instance the webui, but when running ipfs add, my
> > > files don't appear there if I open it and the other way around either:
> > > the files are there if I upload them in the webui, and they have a
> > > hash. I can copy the hash from the webui, but it doesn't work if I run
> > > ipfs ls on that hash. And the files added with ipfs add myfile do not
> > > appear in the webui. The webui however has an option to share your
> > > file, I don't know if that is useful.
> >
> > You have in IPFS file-objects and directory-objects. The command 'ipfs
> > ls' is for listing the contents of directory-objects (i.e. list the
> > files in that directory). If you use it on file-objects (that exist in
> > the store), the command just returns with no output.
> > This works for me:
> > I have the daemon down, i.e. no 'ipfs daemon' started.
> > Then I can still add things to my local IPFS-store:
> >
> > mkdir baz
> > echo "foo" > baz/foo.txt
> > echo "bar" > baz/bar.txt
> > echo "Hello World, now it is $(date -u -Ins)" > baz/hello.txt
> > cat baz/hello.txt
> >
> > Hello World, now it is 2018-12-18T12:08:57,304514914+00:00
> >
> >
> > ipfs add -r baz/
> > added QmTz3oc4gdpRMKP2sdGUPZTAGRngqjsi99BPoztyP53JMM baz/bar.txt
> > added QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 baz/foo.txt
> > added QmXXZWRsLhFAHNWW6tH4TJVB2UiUPsUX8TZhYavqTne6RH baz/hello.txt
> > added QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg baz
> >  67 B / 67 B
> > [=] 100.00%
> >
> >
> > Now I can open the web-browser:
> >
> > localhost:9090/ipfs/QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
> >
> > --> (Note:I changed my port from default 8080 to 9090, on 8080 is
> > already something listening) Unable to connect, I don't have the daemon
> > up yet.
> >
> > https://ipfs.io/ipfs/QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
> >
> > --> Shows "foo", as this Hash is already uploaded in the global network
> > by someone else.
> >
> > https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
> >
> > --> That is with the hash of the directory. Times out, this is not found
> > on the global network.
> >
> >
> > Starting daemon:
> >
> > ipfs daemon
> >
> > Browser:
> >
> > localhost:5001/webui
> >
> > Redirects:
> >
> > 
> > http://localhost:5001/ipfs/QmSDgpiHco5yXdyVTfhKxr3aiJ82ynz8V14QcGKicM3rVh/#/
> >
> > Entering hash of "foo" in the "Explore" tab:
> >
> > QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
> >
> > Finds it, I can view it.
> >
> > Entering hash of "baz" directory:
> >
> > 
> > http://localhost:9090/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
> >
> > Yes, it lists the directory.
> >
> > Globally available:
> >
> > https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
> >
> > I can see the directory structure. And I can see the files foo.txt,
> > bar.txt and hello.txt listed:
> > 
> > https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/bar.txt
> >
> >
> > But, the "hello.txt" takes its time to download, until now I still
> > don't see it:
> >
> > 
> > https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/hello.txt
> >
> > Probably that's because it first needs to search the network and find
> > my little local host for that file. Hm. Strange.
> >
> > Ah, after 5 minutes, it's there! Maybe that's also your problem?
> >
> > Going to my server, daemon is down by default:
> >
> > myserver$ ipfs ls QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
> > Error: merkledag: not found
> >
> > myserver$ echo "foo" > foo.txt
> > myserver$ ipfs add foo.txt
> > added 

Re: bioinformatics.scm vs bioconductor.scm ?

2018-12-18 Thread zimoun
Dear Ricardo,

Thank you for your explanations.

> > And I am asking myself if a massive import from Bioconductor should be
> > possible ?
>
> Certainly!  I’ve done this before actually, but I hit two minor
> problems:
>
> 1. the bioconductor recursive importer does not *automatically* switch
>to “CRAN mode” when a dependent package isn’t found on Bioconductor.
>Not a big problem, but it means that the import isn’t fully
>automatic.

I am not sure I understand.
Is the bioconductor importer usable from `guix import` ?

I have tried once by hand; to understand step by step. :-)
Thank Pierre for the nice tutorial !
see the package definition below.


> 2. compiling big Guile modules (such as a future (gnu packages cran))
>require lots of memory since Guile 2.2(?), so I didn’t add all these
>packages.  This is a bug and we’d have to split the module, probably,
>to work around it.

Ok, even if I have no clue how to work around it.



So, as I need the FlowCore package to process data from cytometry, let
me start my first attempt with this one. :-)

This package is on Bioconductor:
https://bioconductor.org/packages/release/bioc/html/flowCore.html
Then defining the package by hand was straightforward! :-)
I am not sure it is compliant... Basically, I just copied/pasted
the modules from bioinformatics.scm (or bioconductor.scm), then I
checked whether each dependency was already there. r-corpcor from
CRAN was missing, so `guix import cran`.

Hmm, the package BiocGenerics needs version >= 0.1.14, and it is
not defined in the package.
Then the packages grDevices, graphics, methods, stats, and stats4 are
required (see the Bioconductor web page) but not defined elsewhere. Is
that good?

What is the convention about license ?
(license name) or (license license:name)

Well, then I ran guix package --install-from-file=my.scm and it
seems to work; I mean the first examples from the vignette do not
complain. ;-)


Thank you for any comment.


All the best
simon

--

(define-module (gnu packages my-bioinformatics)
  #:use-module ((guix licenses) #:prefix license:)
  #:use-module (guix packages)
  #:use-module (guix utils)
  #:use-module (guix download)
  #:use-module (guix git-download)
  #:use-module (guix hg-download)
  #:use-module (guix build-system ant)
  #:use-module (guix build-system gnu)
  #:use-module (guix build-system cmake)
  #:use-module (guix build-system haskell)
  #:use-module (guix build-system ocaml)
  #:use-module (guix build-system perl)
  #:use-module (guix build-system python)
  #:use-module (guix build-system r)
  #:use-module (guix build-system ruby)
  #:use-module (guix build-system scons)
  #:use-module (guix build-system trivial)
  #:use-module (gnu packages)
  #:use-module (gnu packages autotools)
  #:use-module (gnu packages algebra)
  #:use-module (gnu packages base)
  #:use-module (gnu packages bash)
  #:use-module (gnu packages bison)
  #:use-module (gnu packages bioconductor)
  #:use-module (gnu packages bioinformatics)
  #:use-module (gnu packages boost)
  #:use-module (gnu packages check)
  #:use-module (gnu packages compression)
  #:use-module (gnu packages cpio)
  #:use-module (gnu packages cran)
  #:use-module (gnu packages curl)
  #:use-module (gnu packages documentation)
  #:use-module (gnu packages databases)
  #:use-module (gnu packages datastructures)
  #:use-module (gnu packages file)
  #:use-module (gnu packages flex)
  #:use-module (gnu packages gawk)
  #:use-module (gnu packages gcc)
  #:use-module (gnu packages gd)
  #:use-module (gnu packages gtk)
  #:use-module (gnu packages glib)
  #:use-module (gnu packages graph)
  #:use-module (gnu packages groff)
  #:use-module (gnu packages guile)
  #:use-module (gnu packages haskell)
  #:use-module (gnu packages haskell-check)
  #:use-module (gnu packages haskell-web)
  #:use-module (gnu packages image)
  #:use-module (gnu packages imagemagick)
  #:use-module (gnu packages java)
  #:use-module (gnu packages jemalloc)
  #:use-module (gnu packages dlang)
  #:use-module (gnu packages linux)
  #:use-module (gnu packages logging)
  #:use-module (gnu packages machine-learning)
  #:use-module (gnu packages man)
  #:use-module (gnu packages maths)
  #:use-module (gnu packages mpi)
  #:use-module (gnu packages ncurses)
  #:use-module (gnu packages ocaml)
  #:use-module (gnu packages pcre)
  #:use-module (gnu packages parallel)
  #:use-module (gnu packages pdf)
  #:use-module (gnu packages perl)
  #:use-module (gnu packages perl-check)
  #:use-module (gnu packages pkg-config)
  #:use-module (gnu packages popt)
  #:use-module (gnu packages protobuf)
  #:use-module (gnu packages python)
  #:use-module (gnu packages python-web)
  #:use-module (gnu packages readline)
  #:use-module (gnu packages ruby)
  #:use-module (gnu packages serialization)
  #:use-module (gnu packages shells)
  #:use-module (gnu packages statistics)
  #:use-module (gnu packages swig)
  #:use-module (gnu packages tbb)
  #:use-module (gnu packages tex)
  

Re: Re-approaching package tagging

2018-12-18 Thread zimoun
Dear,

How is the relevance evaluated?
And how does the regexp work?
Maybe I missed the documentation in the manual.

I share the same feeling as Ludo about debtags.

What do you think about `aptitude search` User Interface?
https://www.debian.org/doc/manuals/aptitude/ch02s04s05.en.html

Thank you in advance for any pointer.

All the best,
simon



Re: GC Warning: Out of Memory

2018-12-18 Thread Rene
Hello,

>
> I’d really like to add the bootstrap binaries in master (like the patch
> you sent) and on alpha.gnu.org, but for that we’d need to figure out
> why Guile 2.2 (guile-static-stripped) currently fails to run on
> GNU/Hurd.
>

I'm doing tests with other versions of Guile; previously, I remember
that `guile --version` worked.

Thanks



Re: IPFS trouble

2018-12-18 Thread Ricardo Wurmus


Hi Laura,

> Thank you! I will try it. I felt really silly for spending  too much
> time on that. I just want to share my videos so that you can see how
> they are, even they are videos for trying out the video/translation
> tools.

If you have a workflow for building the videos out of source files
(i.e. a Makefile or a shell script), it may be easier to share them.

--
Ricardo




Re: IPFS trouble

2018-12-18 Thread Laura Lazzati
Hi Björn - and guix -

Here is everything I tried

mkdir ipfsFiles
cd
cd videosWithoutTranslation/
cp audio-input-list3.txt /home/laura/ipfsFiles/
cp video-input-list3.txt /home/laura/ipfsFiles/
ipfs add -r ipfsFiles/
cd ipfsFiles
ipfs  init
initializing IPFS node at /home/laura/.ipfs
generating 2048-bit RSA keypair...done
peer identity: QmVJW3dAuoXaqeeHeVrsD7fJHhrAxoriR2JurtGNx3e8he
to get started, enter:

ipfs cat /ipfs/QmS4ustL54uo8FzR9455qaxZwuMiUhyvMcX9Ba8nUH4uVv/readme
cd
ipfs add -r ipfsFiles/
added QmVmtcXockAvmA9ZorgksmwK4PiSV4AoRx7KpoLczg2AD9
ipfsFiles/audio-input-list3.txt
added QmTtutKMVDCWtinAC8m1Xft4u4P84uCjPhqJ9amVswCSAe
ipfsFiles/video-input-list3.txt
added Qmd2aRaa6rSEkwU95Lk3RGtHp7oBzJyM97oxgTk7ENrVaB ipfsFiles
 115 B / 115 B 
[=]
100.00%

localhost:9090/ipfs/QmTtutKMVDCWtinAC8m1Xft4u4P84uCjPhqJ9amVswCSAe
 did not work either

ipfs config Addresses.Gateway /ip4/0.0.0.0/tcp/9090

https://ipfs.io/ipfs/QmTtutKMVDCWtinAC8m1Xft4u4P84uCjPhqJ9amVswCSAe
no one else has my hash

ipfs daemon
Initializing daemon...
go-ipfs version: 0.4.19-dev-
Repo version: 7
System version: amd64/linux
Golang version: go1.11.1
Successfully raised file descriptor limit to 2048.
Swarm listening on /ip4/10.0.2.15/tcp/4001
Swarm listening on /ip4/127.0.0.1/tcp/4001
Swarm listening on /ip6/::1/tcp/4001
Swarm listening on /p2p-circuit
Swarm announcing /ip4/10.0.2.15/tcp/4001
Swarm announcing /ip4/127.0.0.1/tcp/4001
Swarm announcing /ip6/::1/tcp/4001
API server listening on /ip4/127.0.0.1/tcp/5001
Gateway (readonly) server listening on /ip4/0.0.0.0/tcp/9090
Daemon is ready

went to webui:
In explore tab, inserting the hash of a file

After I don't know how much time I waited (I did not count it, but the
response time was unbearable, it was not 5 min at all)

CID: QmVmtcXockAvmA9ZorgksmwK4PiSV4AoRx7KpoLczg2AD9
Size: 48 B
Links: 0
Data:
Object {type: "file", data: Buffer[40], blockSizes: Array[0]}

type: "file"
data: Buffer[40]
blockSizes: Array[0]

Doing the same with my directory:

CID: Qmd2aRaa6rSEkwU95Lk3RGtHp7oBzJyM97oxgTk7ENrVaB
Size: 261 B
Links: 2
Data:

Object {type: "directory", data: undefined, blockSizes: Array[0]}
type: "directory"
data: undefined
blockSizes: Array[0]

Path / CID:
0 audio-input-list3.txt QmVmtcXockAvmA9ZorgksmwK4PiSV4AoRx7KpoLczg2AD9
1 video-input-list3.txt QmTtutKMVDCWtinAC8m1Xft4u4P84uCjPhqJ9amVswCSAe

without problems

But if I go to the files tab, I cannot see the files. I only have the
option of add file, that was what I was trying to write yesterday

https://ipfs.io/ipfs/Qmd2aRaa6rSEkwU95Lk3RGtHp7oBzJyM97oxgTk7ENrVaB
Waits forever and ends up with about:blank; it is impossible to see it

in my terminal: ipfs ls QmVmtcXockAvmA9ZorgksmwK4PiSV4AoRx7KpoLczg2AD9
nothing shown

ipfs ls Qmd2aRaa6rSEkwU95Lk3RGtHp7oBzJyM97oxgTk7ENrVaB
(instantly, daemon always running but not in background)
QmVmtcXockAvmA9ZorgksmwK4PiSV4AoRx7KpoLczg2AD9 48 audio-input-list3.txt
QmTtutKMVDCWtinAC8m1Xft4u4P84uCjPhqJ9amVswCSAe 83 video-input-list3.txt

ipfs cat
ipfs: Reading from /dev/stdin; send Ctrl-d to stop.
^C
Error: Post http://127.0.0.1:5001/api/v0/cat?encoding=json=true:
context canceled

Is it OK to share my ipfs id and that hash of my directory so that you
see  my files?

Regards!
Laura



Re: Re-approaching package tagging

2018-12-18 Thread Christopher Lemmer Webber
Ludovic Courtès writes:

> I’m surprised you don’t mention --search, which is more appropriate than
> -A (‘-A’ is here only to search among package names):
>
> --8<---cut here---start->8---
> $ guix package -s roguelike | recsel -p name,relevance
> name: roguebox-adventures
> relevance: 7
>
> name: tome4
> relevance: 5
>
> name: crawl
> relevance: 5
>
> name: crawl-tiles
> relevance: 5
>
> name: cataclysm-dda
> relevance: 5
>
> name: angband
> relevance: 5
> --8<---cut here---end--->8---
>
> I’m very much in favor of improving ‘--search’ until we’re happy with
> the results it gives.
>
> WDYT?
>
> Ludo’.

I'm embarrassed to say I didn't know about --search :)



Re: IPFS trouble

2018-12-18 Thread Laura Lazzati
On Tue, Dec 18, 2018 at 10:00 AM Björn Höfling
 wrote:
>
> Hi Laura,
>
> I'm sending this also to guix-devel [and sorry for the previous, empty,
> private mail, I was too fast on the sending button].
>
>
> Note: I'm also new to IPFS, so I hope everything is correct here, if
> someone knows better, please reply.
>
> On Tue, 18 Dec 2018 00:04:56 -0300
> Laura Lazzati  wrote:
>
>
> > > A good guide is:
> > > https://medium.com/textileio/the-definitive-guide-to-publishing-content-on-ipfs-ipns-dfe751f1e8d0.
>
> > Sorry, I read the documentation, but I am mixed up.
> > I have my peer identity, and my /ipfs/hash...
> > And I find confusing several things:
> > If I run ipfs add myfile, using my command line, I cannot find myfile
> > in my node. even the add command returns a hash for that file I guess.
> > And if I run ipfs ls or cat that hash, the file is shown.
>
> I don't get what you mean with "I cannot find my file". Where are you
> looking for it?
>
> > I found useful for instance the webui, but when running ipfs add, my
> > files don't appear there if I open it and the other way around either:
> > the files are there if I upload them in the webui, and they have a
> > hash. I can copy the hash from the webui, but it doesn't work if I run
> > ipfs ls on that hash. And the files added with ipfs add myfile do not
> > appear in the webui. The webui however has an option to share your
> > file, I don't know if that is useful.
>
>
>
> You have in IPFS file-objects and directory-objects. The command 'ipfs
> ls' is for listing the contents of directory-objects (i.e. list the
> files in that directory). If you use it on file-objects (that exist in
> the store), the command just returns with no output.
>
>
> This works for me:
>
> I have the daemon down, i.e. no 'ipfs daemon' started.
>
> Then I can still add things to my local IPFS-store:
>
> ```
> mkdir baz
> echo "foo" > baz/foo.txt
> echo "bar" > baz/bar.txt
> echo "Hello World, now it is $(date -u -Ins)" > baz/hello.txt
> cat baz/hello.txt
>
> Hello World, now it is 2018-12-18T12:08:57,304514914+00:00
>
>
> ipfs add -r baz/
> added QmTz3oc4gdpRMKP2sdGUPZTAGRngqjsi99BPoztyP53JMM baz/bar.txt
> added QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 baz/foo.txt
> added QmXXZWRsLhFAHNWW6tH4TJVB2UiUPsUX8TZhYavqTne6RH baz/hello.txt
> added QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg baz
>  67 B / 67 B
> [=] 100.00%
>
>
> Now I can open the web-browser:
>
> localhost:9090/ipfs/QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
>
> --> (Note:I changed my port from default 8080 to 9090, on 8080 is
> already something listening) Unable to connect, I don't have the daemon
> up yet.
>
> https://ipfs.io/ipfs/QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
>
> --> Shows "foo", as this Hash is already uploaded in the global network
> by someone else.
>
> https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
>
> --> That is with the hash of the directory. Times out, this is not found
> on the global network.
>
>
> Starting daemon:
>
> ipfs daemon
>
> Browser:
>
> localhost:5001/webui
>
> Redirects:
>
> http://localhost:5001/ipfs/QmSDgpiHco5yXdyVTfhKxr3aiJ82ynz8V14QcGKicM3rVh/#/
>
> Entering hash of "foo" in the "Explore" tab:
>
> QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
>
> Finds it, I can view it.
>
> Entering hash of "baz" directory:
>
> http://localhost:9090/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
>
> Yes, it lists the directory.
>
> Globally available:
>
> https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
>
> I can see the directory structure. And I can see the files foo.txt,
> bar.txt and hello.txt listed:
> https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/bar.txt
>
>
> But, the "hello.txt" takes its time to download, until now I still
> don't see it:
>
> https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/hello.txt
>
> Probably that's because it first needs to search the network and find
> my little local host for that file. Hm. Strange.
>
> Ah, after 5 minutes, it's there! Maybe that's also your problem?
>
> Going to my server, daemon is down by default:
>
> myserver$ ipfs ls QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
> Error: merkledag: not found
>
> myserver$ echo "foo" > foo.txt
> myserver$ ipfs add foo.txt
> added QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 foo.txt
>  4 B / 4 B
> [===] 100.00%
> myserver$ ipfs ls QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
> myserver$
>
> (i.e., no error, no output)
>
>
>
> myserver$ ipfs daemon &
>
> Looking for the directory:
>
> myserver$ ipfs ls QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
> [Waiting 3 minutes nothing happens, then:]
> QmTz3oc4gdpRMKP2sdGUPZTAGRngqjsi99BPoztyP53JMM 12 bar.txt
> QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 12 foo.txt
> QmXXZWRsLhFAHNWW6tH4TJVB2UiUPsUX8TZhYavqTne6RH 67 hello.txt
>
> myserver$ ipfs 

Re: Going through the bugs...

2018-12-18 Thread Joshua Branson
swedebu...@riseup.net writes:

> Hi
>
> Here are more visual statistics:
> https://debbugs.gnu.org/rrd/guix.html

How did you generate the graphs?

-- 
Joshua Branson
Sent from Emacs and Gnus



IPFS trouble

2018-12-18 Thread Björn Höfling
Hi Laura,

I'm sending this also to guix-devel [and sorry for the previous, empty,
private mail, I was too fast on the sending button].


Note: I'm also new to IPFS, so I hope everything is correct here, if
someone knows better, please reply.

On Tue, 18 Dec 2018 00:04:56 -0300
Laura Lazzati  wrote:


> > A good guide is:
> > https://medium.com/textileio/the-definitive-guide-to-publishing-content-on-ipfs-ipns-dfe751f1e8d0.
> >   

> Sorry, I read the documentation, but I am mixed up.
> I have my peer identity, and my /ipfs/hash...
> And I find confusing several things:
> If I run ipfs add myfile, using my command line, I cannot find myfile
> in my node. even the add command returns a hash for that file I guess.
> And if I run ipfs ls or cat that hash, the file is shown.

I don't get what you mean with "I cannot find my file". Where are you
looking for it?

> I found useful for instance the webui, but when running ipfs add, my
> files don't appear there if I open it and the other way around either:
> the files are there if I upload them in the webui, and they have a
> hash. I can copy the hash from the webui, but it doesn't work if I run
> ipfs ls on that hash. And the files added with ipfs add myfile do not
> appear in the webui. The webui however has an option to share your
> file, I don't know if that is useful.



You have in IPFS file-objects and directory-objects. The command 'ipfs
ls' is for listing the contents of directory-objects (i.e. list the
files in that directory). If you use it on file-objects (that exist in
the store), the command just returns with no output.


This works for me:

I have the daemon down, i.e. no 'ipfs daemon' started.

Then I can still add things to my local IPFS-store:

```
mkdir baz
echo "foo" > baz/foo.txt
echo "bar" > baz/bar.txt
echo "Hello World, now it is $(date -u -Ins)" > baz/hello.txt
cat baz/hello.txt 

Hello World, now it is 2018-12-18T12:08:57,304514914+00:00


ipfs add -r baz/
added QmTz3oc4gdpRMKP2sdGUPZTAGRngqjsi99BPoztyP53JMM baz/bar.txt
added QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 baz/foo.txt
added QmXXZWRsLhFAHNWW6tH4TJVB2UiUPsUX8TZhYavqTne6RH baz/hello.txt
added QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg baz
 67 B / 67 B
[=] 100.00%


Now I can open the web-browser:

localhost:9090/ipfs/QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6

--> (Note:I changed my port from default 8080 to 9090, on 8080 is
already something listening) Unable to connect, I don't have the daemon
up yet.

https://ipfs.io/ipfs/QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6

--> Shows "foo", as this Hash is already uploaded in the global network
by someone else.

https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg

--> That is with the hash of the directory. Times out, this is not found
on the global network.


Starting daemon:

ipfs daemon

Browser:

localhost:5001/webui

Redirects:

http://localhost:5001/ipfs/QmSDgpiHco5yXdyVTfhKxr3aiJ82ynz8V14QcGKicM3rVh/#/

Entering hash of "foo" in the "Explore" tab:

QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6

Finds it, I can view it.

Entering hash of "baz" directory:

http://localhost:9090/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg

Yes, it lists the directory.

Globally available:

https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg

I can see the directory structure. And I can see the files foo.txt,
bar.txt and hello.txt listed:
https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/bar.txt


But, the "hello.txt" takes its time to download, until now I still
don't see it:

https://ipfs.io/ipfs/QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/hello.txt

Probably that's because it first needs to search the network and find
my little local host for that file. Hm. Strange.

Ah, after 5 minutes, it's there! Maybe that's also your problem?

Going to my server, daemon is down by default:

myserver$ ipfs ls QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
Error: merkledag: not found

myserver$ echo "foo" > foo.txt
myserver$ ipfs add foo.txt 
added QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 foo.txt
 4 B / 4 B
[===] 100.00%
myserver$ ipfs ls QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6
myserver$

(i.e., no error, no output)



myserver$ ipfs daemon &

Looking for the directory:

myserver$ ipfs ls QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg
[Waiting 3 minutes nothing happens, then:]
QmTz3oc4gdpRMKP2sdGUPZTAGRngqjsi99BPoztyP53JMM 12 bar.txt
QmYNmQKp6SuaVrpgWRsPTgCQCnpxUYGq76YEKBXuj2N4H6 12 foo.txt
QmXXZWRsLhFAHNWW6tH4TJVB2UiUPsUX8TZhYavqTne6RH 67 hello.txt

myserver$ ipfs cat
QmZ9iMU1iKRpAs7dR7XTLGaYtkcYFn6EiMXRhqpk5jaeNg/hello.txt Hello World,
now it is 2018-12-18T12:08:57,304514914+00:00

(instantly!)

Hope this helps a bit,

Björn




Re: Re-approaching package tagging

2018-12-18 Thread Catonano
On Tue, 18 Dec 2018 at 08:48, Catonano wrote:

>
>
> On Mon, 17 Dec 2018 at 22:10, swedebugia wrote:
>
>> Hi :)
>>
>> On 2018-12-17 20:01, Christopher Lemmer Webber wrote:
>> > Hello,
>> >
>> > In the past when we've discussed package tagging, I think Ludo' has been
>> > against it, primarily because it's a giant source of bikeshedding.  I
>> > agree that it's a huge space for bikeshedding... no space provides more
>> > bikeshedding than naming things, and tagging things is a many to many
>> > naming system.
>> >
>> > However, I will say that finding packages based on topical interest is
>> > pretty hard right now.  If I want to find all the available roguelikes:
>> >
>> > cwebber@jasmine:~$ guix package -A rogue
>> > hyperrogue            10.5    out gnu/packages/games.scm:3652:2
>> > roguebox-adventures   2.2.1   out gnu/packages/games.scm:1047:2
>> >
>> > Hm, that's strange, there's definitely more roguelikes that should show
>> > up than that!  A more specific search is even worse:
>> >
>> > cwebber@jasmine:~$ guix package -A roguelike
>> > cwebber@jasmine:~$
>> >
>> > What I should have gotten back:
>> >   - angband
>> >   - cataclysm-dda
>> >   - crawl
>> >   - crawl-tiles
>> >   - hyperrogue
>> >   - nethack
>> >   - roguebox-adventures
>> >   - tome4
>> >
>> > So I only got 1/4 of the entries I was interested in in my first query.
>> > Too bad!
>> >
>> > I get that we're opening up space for bikeshedding and *that's true*.
>> > But it seems like not doing so makes things hard on users.
>> >
>> > What do you think?  Is there a way to open the (pandora's?) box of tags
>> > safely?
>>
>> Yes and no.
>>
>> Pjotr and I have discussed this relating to biotech software. He said
>> that many scientists have a hard time finding the right tools for the job.
>>
>> I proposed tight integration with wikidata[1] (every software in the
>> world will eventually have an item there) and Guix (QID on every package
>> and lookup/category integration) and leave all the categorizing to them.
>> Ha problem sidestepped, they are bikeshedding experts over there in
>> wikiland! :D
>>
>> The advantage of this is that everyone using wikidata (every package
>> manager) could pull the same categorization so we only do it once in a
>> central
>>
>> What do you think?
>>
>> --
>>
>>
>
> There is also the Free Software Directory
> https://directory.fsf.org/wiki/Main_Page
>
> I don't know what the relationship between Wikidata and the FSD is
>
> Does Wikidata import data from the FSD? Or vice versa?
>


There happens to be this thread on their mailing list, I think it's
relevant (the whole thread is interesting)

https://lists.gnu.org/archive/html/directory-discuss/2018-11/msg0.html


Re: Re-approaching package tagging

2018-12-18 Thread Ludovic Courtès
Hello,

Christopher Lemmer Webber  skribis:

> In the past when we've discussed package tagging, I think Ludo' has been
> against it, primarily because it's a giant source of bikeshedding.  I
> agree that it's a huge space for bikeshedding... no space provides more
> bikeshedding than naming things, and tagging things is a many to many
> naming system.

The reason I’m unconvinced about tags is that I used to be a big fan of
them, back when debtags was introduced (long ago!), but then I had to
face reality: people (me included) would just do plain text searches,
not sophisticated tag queries.

> However, I will say that finding packages based on topical interest is
> pretty hard right now.  If I want to find all the available roguelikes:
>
> cwebber@jasmine:~$ guix package -A rogue
> hyperrogue            10.5    out gnu/packages/games.scm:3652:2
> roguebox-adventures   2.2.1   out gnu/packages/games.scm:1047:2

I’m surprised you don’t mention --search, which is more appropriate than
-A (‘-A’ is here only to search among package names):

--8<---cut here---start->8---
$ guix package -s roguelike | recsel -p name,relevance
name: roguebox-adventures
relevance: 7

name: tome4
relevance: 5

name: crawl
relevance: 5

name: crawl-tiles
relevance: 5

name: cataclysm-dda
relevance: 5

name: angband
relevance: 5
--8<---cut here---end--->8---

I’m very much in favor of improving ‘--search’ until we’re happy with
the results it gives.

WDYT?

Ludo’.



Re: GC Warning: Out of Memory

2018-12-18 Thread Ludovic Courtès
Hello,

Rene  skribis:

>>
>> Rene are you still using the binaries I had provided?
>>
>
> I use the binaries generated from Guix(master or core-updates). And then I 
> use the attached patch, to update the binary hashes.

I’d really like to add the bootstrap binaries in master (like the patch
you sent) and on alpha.gnu.org, but for that we’d need to figure out
why Guile 2.2 (guile-static-stripped) currently fails to run on
GNU/Hurd.

Thanks,
Ludo’.



Re: video status

2018-12-18 Thread Ludovic Courtès
Hi Laura,

Laura Lazzati  skribis:

> I am writing a quick mail to tell you about the status of the videos.
> I have been trying Ricardo's proposed tools that are in the
> libreplanet site, and have created videos with audio and slides first
> in English, and then translated previous slides from a presentation
> adding to them only the translated slides, not the translated audios.
> I will be creating a Makefile to automate this these days, as well as
> looking at the tools for adding subtitles.
> The videos are short, they have just three slides and are not about
> the documentation stuff. I will probably be writing both here or over
> the IRC channel if I am blocked with sth related to technical stuff.
> Also feel free to share your ideas, also about the tools.

Thanks for the update.  It sounds like you’re on the right track and
it’s exciting to see this happening!

Ludo’.