Re: [julia-users] Re: quadgk with 2 arguments

2016-10-18 Thread Mosè Giordano
Hi Steven,

2016-10-19 0:36 GMT+02:00 Steven G. Johnson :
> For example:
>
>  quadgk(z -> 1/z, 1, 1im, -1, -1im)
>
> integrates 1/z over a closed counter-clockwise diamond-shaped contour around
> the origin in the complex plane, returning 2πi by the residue theorem.

Did you mean

 quadgk(z -> 1/z, 1, 1im, -1, -1im, 1)

?  The integral you showed is not over a closed contour and evaluates
to 3//2*pi*im.

In any case, I have to admit that quadgk is much more powerful than I
expected, not least because it is implemented purely in Julia and so
can work with any Julia type.

Bye,
Mosè


[julia-users] Re: quadgk with 2 arguments

2016-10-16 Thread Mosè Giordano

>
> Btw: Can quadgk also be used for complex functions? (the integration is 
> still over a real range)
>

Just try it:

julia> quadgk(cis, 0, pi/2)
(1.0 + 0.9999999999999999im,2.482534153247273e-16)

However, for more advanced integration (in particular multidimensional 
integration and vector-valued functions) you may want to use external 
packages, like Cuba.jl and Cubature.jl.

Bye,
Mosè


Re: [julia-users] Re: julia-i18n: Translators and reviewer needed!

2016-10-07 Thread Mosè Giordano
Hi Ismael,

I can confirm the problems in
http://julialang.org/downloads/platform.html are now fixed and I added
the missing string.  Thank you!

However, I just noticed that the description of Gadfly in
http://julialang.org/downloads/plotting.html is back in English (this
happens for both Italian and Spanish, for the same strings.  This was
also the case with the platform page).

Bye,
Mosè

2016-10-05 17:21 GMT+02:00 Ismael Venegas Castelló:
> I have re-applied the translations. Just so everybody knows, translations
> are never, ever deleted: even if the original strings are somehow marked as
> ignored, even if the website endpoint is deleted, translations always stay
> stored. Please check out that page now, it should be fixed:
>
>
> In the image above, you can see that this particular string changed, there
> is no mention about Snow Leopard anymore, when an update like this happens,
> you will see it as "untranslated" but if you pay more attention you'll see
> it has a suggestion that matches almost fully. In this case I just reused
> your translated string but removed the Snow Leopard mention from the
> translation and then clicked "save" again.
>
>
>
>
> In this other image, you can see that you have a new string. This doesn't
> have any prior suggestions, so it must be a new string that was added to
> this page. Your current progress without this string translated is 97%; you
> just need to translate it and you'll have 100% on this page again. If you
> have any more doubts or suggestions, or you want to report anything else, please
> do not hesitate to do so. I have been very busy, which is why I was late to
> answer your concern. Have a very good day!


[julia-users] Re: julia-i18n: Translators and reviewer needed!

2016-10-02 Thread Mosè Giordano
Hi,

last week I completely translated 
http://julialang.org/downloads/platform.html to Italian (or at least I 
thought I did), but now most of the translations have disappeared, without 
recent changes to 
https://github.com/JuliaLang/julialang.github.com/blob/master/downloads/platform.md.
What happened to them?  Did someone else notice a similar phenomenon?  I'd not 
be very happy to translate the same strings over and over again just 
because they disappear for some random reason.

Bye,
Mosè


Re: [julia-users] Re: Idea: Julia Standard Libraries and Distributions

2016-09-14 Thread Mosè Giordano
2016-09-15 0:10 GMT+02:00 Chris Rackauckas:
> If you use Reexport.jl's @reexport in the top-scope of a module for such a
> metapackage it will actually export all of the functions from inside:
>
> module MyMetapackage
>   using Reexport
>   @reexport using DifferentialEquations
>   @reexport using Plots
> end
>
>
>
> using MyMetapackage # Imports both packages
> plot # 3 methods
> solve # 7 methods

Nice!  Then I'd vote in favor of meta-packages, rather than distributions.

It seems we basically came to something already proposed almost two
years ago: https://github.com/JuliaLang/julia/issues/1986#issuecomment-67957609
:-)

Bye,
Mosè


Re: [julia-users] Re: Idea: Julia Standard Libraries and Distributions

2016-09-14 Thread Mosè Giordano
2016-09-14 23:46 GMT+02:00 Chris Rackauckas:
> I too am wary about how different distributions would play together. I'd
> assume you'd just install one and that would be it.

There are many people working in many fields who may want to install
different distributions.  I don't think that preventing them from
installing all of them, or forcing them to choose only one, is a good
idea.  Creating many cross-field distributions is not viable.  Some
people may discard the idea of installing a distribution for these
very reasons.

> And yes, not
> putting "using" in code that you share can lead to some issues, so maybe
> "using MyMetapackage" and having the repo online so people can see all of
> your packages (at that date/time) would be helpful?

I think this works better with a meta-package, doesn't it?  The
problem is that functions are not exported.

Bye,
Mosè


[julia-users] Re: Idea: Julia Standard Libraries and Distributions

2016-09-14 Thread Mosè Giordano
Hi Chris,

what would be the difference between a distribution and a meta-package like 
those used, for example, in Debian and derivatives: a package without code 
that only requires other packages?  In this sense you can create a 
meta-package right now: just create a repository for a Julia package and 
populate the REQUIRE file as you wish.  You can specify a Julia version and 
provide a build script.
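As a sketch, the REQUIRE file of such a hypothetical meta-package could look like this (the package names here are just examples):

```
julia 0.4
DifferentialEquations
Plots
```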

I'm above all worried about the interplay between different distributions.  
Would there be clashing .juliarc's?  Why provide a .juliarc at all?  
Automatically running some "using"s & co. at startup is definitely useful, 
but in the long run it can lead to bad habits, like forgetting to put "using" 
commands in code shared with people not using the same distribution.

Bye,
Mosè

I think one major point of contention is what should be 
> included in Base, due to competing factors:
>
>
>1. Some people would like a "lean Base" for things like embedded 
>installs or other low memory applications
>2. Some people want a MATLAB-like "bells and whistles" approach. This 
>way all the functions they use are just there: no extra packages to 
>find/import.
>3. Some people like having things in Base because it "standardizes" 
>things. 
>4. Putting things in Base constrains their release schedule. 
>5. Putting things in packages outside of JuliaLang helps free up 
>Travis.
>
>
> The last two concerns have been why things like JuliaMath have sprung up 
> to move things out of Base. However, I think there is some credibility to 
> having some form of standardization. I think this can be achieved through 
> some kind of standard library. This would entail a set of packages which 
> are installed when Julia is installed, and a set of packages which add 
> their using statement to the .juliarc. To most users this would be 
> seamless: they would install automatically, and every time you open Julia, 
> they would import automatically. There are a few issues there:
>
>
>1.  This wouldn't work with building from source. This idea works 
>better for binaries (this is no biggie since these users are likely more 
>experienced anyways)
>2. Julia would have to pick winners and losers.
>
> That second part is big: what goes into the standard library? Would all of 
> the MATLAB functions like linspace, find, etc. go there? Would the sparse 
> matrix library be included?
>
> I think one way to circumvent the second issue would be to allow for Julia 
> Distributions. A distribution would be defined by:
>
>
>1. A Julia version
>2. A List of packages to install (with versions?)
>3. A build script
>4. A .juliarc
>
> The ideal would be for one to be able to make an executable from those 
> parts which would install the Julia version with the specified packages, 
> build the packages (and maybe modify some environment variables / 
> defaults), and add a .juliarc that would automatically import some packages 
> / maybe define some constants or checkout branches. JuliaLang could then 
> provide a lean distribution and a "standard distribution" where the 
> standard distribution is a more curated library which people can fight 
> about, but it's not as big of a deal if anyone can make their own. This has 
> many upsides:
>
>
>1. Julia wouldn't have to come with what you don't want.
>2. Other than some edge cases where the advantages of Base come into 
>play (I don't know of a good example, but I know there are some things 
>which can't be defined outside of Base really well, like BigFloats? I'm 
> not 
>the expert on this.), most things could spawn out to packages without the 
>standard user ever noticing.
>3. There would still be a large set of standard functions you can 
>assume most people will have.
>4. You can share Julia setups: for example, with my lab I would share 
>a distribution that would have all of the JuliaDiffEq packages installed, 
>along with Plots.jl and some backends, so that way it would be "in the box 
>solve differential equations and plot" setup like what MATLAB provides. I 
>could pick packages/versions that I know work well together, 
>and guarantee their install will work. 
>5. You could write tutorials / run workshops which use a distribution, 
>knowing that a given set of packages will be available.
>6. Anyone could make their setup match yours by looking at the 
>distribution setup scripts (maybe just make a base function which runs 
> that 
>install since it would all be in Julia). This would be nice for some work 
>in progress projects which require checking out master on 3 different 
>packages, and getting some weird branch for another 5. It would give you a 
>succinct and standardized way to specify an install to get there.
>
>
> Side notes:
>
> [An interesting distribution would be that JuliaGPU could provide a full 
> distribution for wh

[julia-users] [ANN] Deconvolution.jl: library for deconvolution of digital signals

2016-09-08 Thread Mosè Giordano
Hi all,

I'm pleased to announce a new package: Deconvolution.jl.  It is part of the 
JuliaDSP project and aims at providing a set of tools to deconvolve digital 
signals.  The package is released under the terms of the MIT "Expat" License 
and its documentation is available at http://deconvolutionjl.readthedocs.io/.


Currently the package provides only one method, the Wiener deconvolution, 
but more will hopefully come in the future.  I will maintain the package 
(take care of bug reports and pull requests), but I do not plan to develop 
it further because this is not my field of interest.  If someone is 
interested in taking over the package, please open an issue on GitHub.
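For the curious, usage looks roughly like this (a sketch based on the package README; the exact signature of `wiener` may differ between versions):

```julia
using Deconvolution

# Hypothetical data: a noisy, blurred input, an estimate of the
# original signal, and an estimate of the noise (all arrays).
input  = rand(256)
signal = rand(256)
noise  = rand(256)

# Wiener deconvolution of `input`, given the two estimates above.
polished = wiener(input, signal, noise)
```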

Happy deconvolution,
Mosè


[julia-users] Re: How to shuffle the order of matrix by rows

2016-09-07 Thread Mosè Giordano
Hi Ahmed,


Hello,
> l have a matrix as follow :
>  k=rand(1:100,3,3)
> 3x3 Array{Int64,2}:
>  97  26  77
>  58  64  86
>  87  85  99
>
>
> I want to randomly shuffle the order of the lines, for instance line 1 becomes 
> line 3.  I have a matrix of 1 by 700.
>

The following function is based on the definition of shuffle! in Julia 0.4 
(it's different in Julia 0.5):

function shufflerows!{T<:Real}(a::AbstractArray{T})
    for i = size(a, 1):-1:2
        j = rand(1:i)
        a[i,:], a[j,:] = a[j,:], a[i,:]
    end
    return a
end

Here it is in action:

julia> a = rand(1:100, 5, 3)
5x3 Array{Int64,2}:
 18  90  34
  9  24  61
 49  73  84
 70  85  20
 66  59  37

julia> shufflerows!(a)
5x3 Array{Int64,2}:
  9  24  61
 18  90  34
 70  85  20
 49  73  84
 66  59  37

Note that the matrix is changed in-place.
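If mutating in place is not required, a simpler alternative (a sketch, not from the original thread) is to index with a random permutation of the row indices, which works on both Julia 0.4 and 0.5:

```julia
# Build a new matrix whose rows are a random permutation of the rows of `a`.
b = a[shuffle(1:size(a, 1)), :]
```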

Cheers,
Mosè


Re: [julia-users] AutoGrad port for Julia

2016-08-29 Thread Mosè Giordano
Hi Tom,


Mose: what version of julia are you on? Anonymous functions and closures 
> are much faster on 0.5... In fact there should be no performance penalty vs 
> regular functions, which allows you to rethink your paradigm. 
>

It was Julia 0.4.6, but I get similar results also with Julia 0.6-dev.72:

julia> using BenchmarkTools, AutoGrad

julia> COS = grad(sin)
(::gradfun) (generic function with 1 method)

julia> @benchmark cos(0.0)
BenchmarkTools.Trial: 
  samples:  1
  evals/sample: 1000
  time tolerance:   5.00%
  memory tolerance: 1.00%
  memory estimate:  0.00 bytes
  allocs estimate:  0
  minimum time: 6.00 ns (0.00% GC)
  median time:  6.00 ns (0.00% GC)
  mean time:6.03 ns (0.00% GC)
  maximum time: 19.00 ns (0.00% GC)

julia> @benchmark COS(0.0)
BenchmarkTools.Trial: 
  samples:  1
  evals/sample: 1
  time tolerance:   5.00%
  memory tolerance: 1.00%
  memory estimate:  4.05 kb
  allocs estimate:  71
  minimum time: 23.07 μs (0.00% GC)
  median time:  24.27 μs (0.00% GC)
  mean time:25.36 μs (1.63% GC)
  maximum time: 4.23 ms (97.76% GC)

Bye,
Mosè


[julia-users] Re: AutoGrad port for Julia

2016-08-28 Thread Mosè Giordano
Hi Deniz,

Announcing AutoGrad.jl: an 
> automatic differentiation package for Julia. It is a Julia port of the 
> popular Python autograd package. It 
> can differentiate regular Julia code that includes loops, conditionals, 
> helper functions, closures etc. by keeping track of the primitive 
> operations and using this execution trace to compute gradients. It uses 
> reverse mode differentiation (a.k.a. backpropagation) so it can efficiently 
> handle functions with array inputs and scalar outputs. It can compute 
> gradients of gradients to handle higher order derivatives.
>
> Large parts of the code are directly ported from the Python autograd 
> package. I'd like to thank autograd author Dougal Maclaurin for his 
> support. See (Baydin et al. 2015) for a general review of automatic 
> differentiation, the autograd tutorial for some Python examples, and 
> Dougal's PhD thesis for design principles. JuliaDiff has alternative 
> differentiation tools for Julia.
>
> best,
> deniz
>
>
Very nice package!  It's one of the most complete yet easy-to-use automatic 
differentiation packages I've checked out.  Some comments below.

I expected (or hoped) that the performance of the function returned by "grad" 
would be comparable to that of the actual derivative/gradient.  However:

julia> using BenchmarkTools, AutoGrad

julia> COS = grad(sin)
gradfun (generic function with 1 method)

julia> @benchmark cos(0.0)
BenchmarkTools.Trial: 
  samples:  1
  evals/sample: 1000
  time tolerance:   5.00%
  memory tolerance: 1.00%
  memory estimate:  0.00 bytes
  allocs estimate:  0
  minimum time: 4.00 ns (0.00% GC)
  median time:  4.00 ns (0.00% GC)
  mean time:4.05 ns (0.00% GC)
  maximum time: 15.00 ns (0.00% GC)

julia> @benchmark COS(0.0)
BenchmarkTools.Trial: 
  samples:  1
  evals/sample: 1
  time tolerance:   5.00%
  memory tolerance: 1.00%
  memory estimate:  3.61 kb
  allocs estimate:  78
  minimum time: 9.14 μs (0.00% GC)
  median time:  10.30 μs (0.00% GC)
  mean time:10.98 μs (3.46% GC)
  maximum time: 3.88 ms (98.04% GC)


I found similar results for other simple functions.  Is this to be expected, 
or is there room for improvement?

I saw that the derivative dictionaries for most special functions are 
incomplete.  You can have a look at the derivatives in file src/math.jl 
of my Measurements.jl package 
(I could use an automatic differentiation library in my package, but I've 
never found one that suited my needs).  Look at the "result" calls returned 
by the overloaded functions; the second argument is the derivative or tuple 
of derivatives.

I noticed that you marked the derivative of airyprime as wrong.  I guess 
that you've been deceived by the misleading docstring of airy: 
https://github.com/JuliaLang/julia/issues/17032  The second derivatives of 
Airy functions can be computed by exploiting the Airy equation.
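Concretely, since Ai(x) solves the Airy equation y'' = x·y, the second derivative follows directly from the function value itself (a sketch; `airyai` is the Base function for Ai on Julia 0.4/0.5):

```julia
# Airy equation: Ai''(x) = x * Ai(x), so the derivative of
# airyprime (i.e. of Ai') is just x times the function value.
d_airyaiprime(x) = x * airyai(x)
```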

Cheers,
Mosè


[julia-users] Wiener deconvolution

2016-08-28 Thread Mosè Giordano
Hi all,

I wrote a very simple implementation of the Wiener deconvolution in Julia.  
You can find it here as a (non-registered) package: 
https://github.com/giordano/Wiener.jl  The README.md has a basic example of 
use.

I'm not sure it deserves to be a package on its own, so I'm willing to 
include it in another package, but which one?  The Wiener deconvolution is 
often used in image processing, so a good place could be Images.jl (which 
already has other tools to manipulate images), but this technique can be 
applied to any kind of signal, not just images, so another option could 
be DSP.jl.  Please suggest where this code could go.

Cheers,
Mosè


[julia-users] Re: [ANN] LombScargle.jl: compute periodogram for unevenly sampled data

2016-08-19 Thread Mosè Giordano
Hi all,

an update about the situation of LombScargle.jl
<https://github.com/giordano/LombScargle.jl>

2016-07-18 10:15 GMT+02:00 Mosè Giordano:

> In the future I may implement another much-faster Lomb-Scargle algorithm
> by Press & Rybicki (1989, ApJ, 338, 277), which however requires the data
> to be equally sampled (but in this case also the FFT can be used).
>

I managed to implement this method, so now users can choose between three
different implementations of the Lomb-Scargle periodogram:

   - the original algorithm, which deals neither with uncertainties in the
   data nor with a non-zero mean of the signal (based on Townsend, 2010, ApJS,
   191, 247)
   - the so-called generalised Lomb-Scargle algorithm, which takes into
   account uncertainties and a non-zero mean of the signal (based on
   Zechmeister, Kürster, 2009, A&A, 496, 577)
   - an approximation of the generalised Lomb-Scargle algorithm, made very
   fast (and with lower computational complexity than the true Lomb-Scargle
   algorithm) by some tricks suggested by Press & Rybicki, 1989, ApJ, 338,
   277.  The only downside is that this method requires the data to be equally
   sampled.  In the package, this is controlled by the type of the time
   vector: if it's a Range, the fast method is used (but it can be opted out
   of with a keyword)

Other notable features introduced since the first release:

   - you can choose between 7 different normalizations
   - functions to compute the false-alarm probability
   - a function, findmaxpower, to quickly find the maximum power value of
   the periodogram
   - in findmaxfreq it is now possible to restrict the search for the
   frequency with the highest power to a specific range, instead of using the
   whole periodogram
   - a new function, LombScargle.model, to get the Lomb-Scargle model that
   best fits your data at a given frequency (without performing an actual
   periodogram, because this is done just for one frequency)
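A minimal usage sketch, built from the function names listed above (the exact keyword arguments may differ; see the manual):

```julia
using LombScargle

# Unevenly sampled observation times and a noisy sinusoidal signal.
t = 10 * sort(rand(300))
s = sin(2pi * t) + 0.1 * randn(300)

# Compute the periodogram and look for the dominant frequency.
pgram  = lombscargle(t, s)
f_best = findmaxfreq(pgram)
p_max  = findmaxpower(pgram)
```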

The complete manual, with some examples complemented with pictures, is
available at http://lombscarglejl.readthedocs.io/ (here
<http://readthedocs.org/projects/lombscarglejl/downloads/pdf/latest/> the
PDF version).

The latest version of LombScargle.jl
<https://github.com/giordano/LombScargle.jl> is v0.1.1, be sure to update
your package list with Pkg.update() before installing it, if you want to
try it out.

Cheers,
Mosè


[julia-users] Re: overwrite function value

2016-08-11 Thread Mosè Giordano

>
> so when I define
>
> function f!(x)
> x=x+1
> end
>
> and  then
> x=1
> f!(x)
> I get 2 returned but I also want x to be replaced by the newly calculated 
> value :/
>

There is an entry of the FAQ about this: 
http://docs.julialang.org/en/stable/manual/faq/#i-passed-an-argument-x-to-a-function-modified-it-inside-that-function-but-on-the-outside-the-variable-x-is-still-unchanged-why
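As the FAQ explains, assignment inside a function only rebinds the local name. To actually mutate `x` from the inside, it has to be a mutable container, e.g. (a sketch, not from the FAQ):

```julia
# Wrap the value in an array so the function can modify it in place.
function f!(x)
    x[1] = x[1] + 1
    return x
end

x = [1]
f!(x)
# now x[1] == 2: the array's contents were changed in place
```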

Bye,
Mosè


[julia-users] Re: [ANN] Cuba.jl: library for multidimensional numerical integration

2016-08-11 Thread Mosè Giordano
Another relevant update for this package: Cuba.jl can now be used on 
Windows (GNU/Linux and Mac OS were already supported).  I managed to cross 
compile the shared library for Windows on GNU/Linux, so during building of 
the package a prebuilt dll is downloaded for Windows users.

If you want to check out Cuba.jl, remember to issue Pkg.update() before 
installing it, in order to get the latest version (there have been two 
releases today, v0.1.3 being the latest one).

The only remaining open issue is support for parallelization (
https://github.com/giordano/Cuba.jl/issues/1), a feature provided by the 
native Cuba library in C, which however doesn't seem to play nicely with 
Julia because of its use of the fork() function.

Cheers,
Mosè


[julia-users] [ANN] LombScargle.jl: compute periodogram for unevenly sampled data

2016-07-18 Thread Mosè Giordano
Dear all,

I'm pleased to announce the first release of LombScargle.jl, a package to
compute the Lomb-Scargle periodogram.  Unlike the standard FFT, this can
be used to find periodicities in unevenly sampled data, which is a fairly
common case in astronomy, a field where this periodogram is widely used.

The README.md has some examples of use; in addition, a manual is available
at http://lombscarglejl.readthedocs.io/

The package implements the standard Lomb-Scargle periodogram that doesn't
take into account a non-null mean of the signal (but it is possible to
automatically subtract the average of the signal from the signal itself,
and this is the default), and the generalised Lomb-Scargle algorithm which
instead can deal with a non-null mean.

Relevant papers on this topic are:

   - Townsend, R. H. D. 2010, ApJS, 191, 247 (URL:
   http://dx.doi.org/10.1088/0067-0049/191/2/247, Bibcode:
   http://adsabs.harvard.edu/abs/2010ApJS..191..247T)
   - Zechmeister, M., Kürster, M. 2009, A&A, 496, 577 (URL:
   http://dx.doi.org/10.1051/0004-6361:200811296, Bibcode:
   http://adsabs.harvard.edu/abs/2009A%26A...496..577Z)

In the future I may implement another much-faster Lomb-Scargle algorithm by
Press & Rybicki (1989, ApJ, 338, 277), which however requires the data to
be equally sampled (but in this case also the FFT can be used).

In order to test and benchmark the results of LombScargle.jl I compared its
results with those of the equivalent methods provided by the Astropy package.
Running Julia 0.5 I found that the standard Lomb-Scargle periodogram as
implemented in LombScargle.jl is ~40% and ~65% faster than the "scipy" and
"cython" methods of Astropy, respectively (they're both in Cython, not pure
Python).  Instead, the generalised Lomb-Scargle periodogram in LombScargle.jl
is ~25% faster than the "cython" method in Astropy.

The LombScargle.jl package is licensed under the MIT “Expat” License.

Bye,
Mosè


[julia-users] Re: ANN: New Julia Packages website

2016-07-12 Thread Mosè Giordano
Hi Adrian,

nice website!

What I'd like to have in a Julia packages website is categories.  This 
would greatly enhance the possibility for users to find the package 
they're looking for.  Currently one must use search strings, but they may 
not be very effective if the package author didn't use the exact words one 
is using in the search.  Of course this requires help from package 
authors.  I'm using a "keywords" cookie in the package comments like the 
one suggested in the Emacs Lisp conventions: 
https://www.gnu.org/software/emacs/manual/html_node/elisp/Library-Headers.html#Library-Headers
Maybe something similar can be implemented in METADATA.jl, and Pkg.generate 
could accept a category list as an argument.  We could choose a set of 
"official" keywords to be listed in Julia packages websites.  I hope 
this will improve discoverability of packages.

Bye,
Mosè


I've setup an early version of a Julia packages website, for your package 
> discovery pleasure: http://genieframework.com/packages 
>
> Fair warning, this is a test case website for Genie.jl, the full stack web 
> framework I'm working on - and 90% of my focus was on building the actual 
> framework and the app, rather than the accuracy of the data. 
>
> That being said, the app works quite well as far as I can tell (feedback 
> welcome!) and compared to pkg.julialang.org it has a few extra features:  
> * full text search in README 
> * it includes both METADATA registered packages and extra packages crawled 
> from GitHub (not all Julia packages on GitHub are included, this is a known 
> bug and I'm working on fixing it - but all the official packages are 
> there). 
> * lots of info at a glance, to help spot the best packages
> * modern UI
>
> If the core contributors (or whoever's maintaining pkg.julialang.org) 
> think this can be a useful replacement for pkg.julialang.org, I'm happy to 
> donate it and contribute by extending it to add the missing features 
> (license, tests status, etc.) and maintain it. Let me know. 
>
> Adrian
>


[julia-users] Re: A Very Simple Benchmark for Brutal-force Loops in Several Languages: revised, Julia is fast!

2016-07-11 Thread Mosè Giordano
Hi Zhong,

you may want to check out Julia 0.5, on my box your benchmark is ~13% 
faster with Julia 0.5: 3.367658 seconds with Julia 0.4.6 and 2.898068 
seconds with Julia 0.5.


Bye,
Mosè


Hi,
>
> Sorry I have to post a revision so quickly. Just after I posted my 
> previous benchmark, I found the message on the top advising against using 
> global variables when running performance test. I realized I did exactly 
> that, so I went back to redo my test wrapped in a function. And that did 
> the magic! The difference is a ~18x speedup!
>
> I didn't know how to recall my previous post, so I have to post a revised 
> one (attached PDF) here. Since there is no surprise finding about Julia 
> anymore, the benchmark results are not that interesting now; but I have to 
> make sure the information I posted is not misleading, so here you go.
>
> Cheers,
> -Zhong
>
>

[julia-users] Re: [ANN] Measurements.jl v0.1.1: Error propagation library

2016-06-27 Thread Mosè Giordano
Hi,

2016-06-27 4:32 GMT+02:00 g:
>
> Hello,
> This looks very nice. I will use it next time I have a need. 
>

Thank you :-)
 

> I took a brief look at the code, and I have one question. Why did you 
> choose to "tag" your Measurement types with a random Float64, instead of an 
> incrementing integer?
>

How would you do it?  With a global variable?  I didn't benchmark it, but 
I'm worried about performance.  In addition, this would cause a race 
condition when parallelization is enabled, wouldn't it?

Bye,
Mosè


[julia-users] [ANN] Measurements.jl v0.1.1: Error propagation library

2016-06-26 Thread Mosè Giordano
Hi all,

I'm really happy to announce the availability of Measurements.jl v0.1.1, a 
full-fledged uncertainty propagation library.  This is the list of features:

   - Support for most mathematical operations available in Julia involving 
   real and complex numbers. All existing functions that accept 
   AbstractFloat (and Complex{AbstractFloat}) arguments and internally use 
   already supported functions can in turn perform calculations involving 
   numbers with uncertainties without being redefined. This greatly enhances 
   the power of Measurements.jl without effort for the users
   - Functional correlation between variables is correctly handled, so x-x 
   == zero(x), x/x == one(x), tan(x) == sin(x)/cos(x), etc...
   - Support for arbitrary-precision numbers with uncertainties (though 
   this may not be very useful for quantities that are intrinsically 
   imprecise)
   - Define arrays of measurements and perform calculations with them
   - Propagate uncertainty for any function of real arguments (including 
   functions based on C/Fortran calls), using the @uncertain macro
   - Functions to get the derivative and the gradient of an expression with 
   respect to one or more independent measurements
   - Functions to calculate the standard score and the weighted mean
   - Easy way to attach the uncertainty to a number using the ± sign

The complete manual of the package is available at 
http://measurementsjl.readthedocs.io/ with many helpful examples.  The 
package is released under terms of MIT "Expat" license.

Looking at the list of uncertainty propagation software on Wikipedia, it 
seems this is one of the few packages supporting complex numbers with 
uncertainties and operations with arrays, and the only one supporting 
arbitrary-precision arithmetic.

What impressed me most during the development of the package is that, 
thanks to how Julia is designed, I actually didn't do anything to gain 
support for complex measurements, arbitrary-precision arithmetic and 
operations with arrays.  There is not a single line in Measurements.jl's 
code to provide support for all these features.  Only for complex numbers 
did I have to choose wisely the subtyping of the new Measurement type (and 
Steven Johnson helped me on this, thanks!).  For comparison, the Python 
package uncertainties doesn't support complex numbers with uncertainties 
nor multiple-precision arithmetic, and you need to load a separate module 
(so the developer had to write specific code for that) in order to perform 
operations with NumPy arrays.

Regarding performance, operations with Measurements.jl can be up to 50 
times faster than with the uncertainties package.  Let alone that in 
Measurements.jl you can simply write 42 ± 1.3 in order to define a new 
quantity with uncertainty (this nifty feature was suggested to me on this 
mailing list, thanks Lucas Magno!).
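A small taste of the features listed above (a sketch; see the manual for the full API):

```julia
using Measurements

x = 5.2 ± 0.4          # same as measurement(5.2, 0.4)
y = 3.1 ± 0.2

# Uncertainty is propagated through ordinary arithmetic...
z = x * y + sin(x)

# ...and functional correlation is handled, so this holds exactly:
x - x == zero(x)       # true
```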

Bye,
Mosè


Re: [julia-users] Re: Help with a macro

2016-06-23 Thread Mosè Giordano
Hi Mauro,

2016-06-22 21:13 GMT+02:00 Mauro:
>
> A REPL session of mine trying to figure out how to make that macro would 
> look something like this: 
>
> julia> ex = :(f(x, y, z, zz)) # this is what is passed into the macro 
> :(f(x,y,z,zz)) 
>
> julia> xdump(ex)  # xdump is nice to get an overview of nested 
> datastructures 
> Expr 
>   head: Symbol call 
>   args: Array(Any,(5,)) 
> 1: Symbol f 
> 2: Symbol x 
> 3: Symbol y 
> 4: Symbol z 
> 5: Symbol zz 
>   typ: Any::DataType  <: Any 
>

I knew and used "dump" but not "xdump".  I've seen that the latter has been 
deprecated in favor of the former in Julia 0.5.
 

> julia> target = :(f([x.re, y.re, z.re, zz.re].^2 + [x.im, y.im, z.im, 
> zz.im])) # this is your target 
> :(f([x.re,y.re,z.re,zz.re] .^ 2 + [x.im,y.im,z.im,zz.im])) 
>
> julia> xdump(target) 
> Expr 
>   head: Symbol call 
>   args: Array(Any,(2,)) 
> 1: Symbol f 
> 2: Expr 
>   head: Symbol call 
>   args: Array(Any,(3,)) 
> 1: Symbol + 
> 2: Expr 
>   head: Symbol call 
>   args: Array(Any,(3,)) 
>   typ: Any::DataType  <: Any 
> 3: Expr 
>   head: Symbol vect 
>   args: Array(Any,(4,)) 
>   typ: Any::DataType  <: Any 
>   typ: Any::DataType  <: Any 
>   typ: Any::DataType  <: Any 
>
>
> Now, writing the macro is really an exercise in manipulating nested data 
> structures.  With the added bonus that there are convenient constructors 
> for those datastructures, namely expression such as e.g. :(z = 4 +7). 
>
> julia> fn = ex.args[1] # extract the name of the function 
> :f 
>
> julia> args = ex.args[2:end] # its arguments 
> 4-element Array{Any,1}: 
>  :x 
>  :y 
>  :z 
>  :zz 
>
> julia> ar1 =:([])  # build up the first array 
> :([]) 
>
> julia> ar1.head # check 
> :vect 
>
> julia> [push!(ar1.args, :($(args[i]).re)) for i=1:length(args)] #add the 
> elements 
> 4-element Array{Any,1}: 
>  Any[:(x.re),:(y.re),:(z.re),:(zz.re)] 
>  Any[:(x.re),:(y.re),:(z.re),:(zz.re)] 
>  Any[:(x.re),:(y.re),:(z.re),:(zz.re)] 
>  Any[:(x.re),:(y.re),:(z.re),:(zz.re)] 
>
> julia> ar1 
> :([x.re,y.re,z.re,zz.re]) 
>
> julia> :($ar1^2) # square 
> :([x.re,y.re,z.re,zz.re] ^ 2) 
>
> etc. 
>
> It takes a bit of practice and patience.  Important, as Kristoffer said, 
> write out and test the code you expect the macro to form (at least until 
> you become a macro ninja), then write the macro.  For instance in one of 
> my projects I hand wrote all of the code I later built macros to 
> generate: 
> https://github.com/mauro3/Traits.jl/blob/master/test/manual-traitdef.jl 
>

Thank you so much, I've been able to fix my "@uncertain" macro in the 
Measurements.jl package 
following your hints.  In the end, the trick was to quote rather than 
evaluate the array of arguments.  This is probably a good rule of thumb 
when writing macros.
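For readers landing on this thread, the rule can be illustrated in a few self-contained lines.  The `V` type and `@vals` macro below are stand-ins invented for this sketch, not part of Measurements.jl:

```julia
# "Quote rather than evaluate": inside the macro, `a` is just a Symbol
# such as :x.  Building the expression :($(esc(a)).val) defers the field
# access to run time in the caller's scope, whereas eval(a) would look
# the variable up at macro-expansion time, in the wrong scope.
macro vals(args...)
    Expr(:vect, [:($(esc(a)).val) for a in args]...)
end

struct V           # minimal stand-in for a Measurement-like type
    val::Float64
end

x, y = V(2.4), V(3.9)
@vals(x, y)        # expands to [x.val, y.val]
```

The macro never sees the values 2.4 and 3.9; it only rearranges the symbols `:x` and `:y` into a new expression.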

Bye,
Mosè


[julia-users] Re: Help with a macro

2016-06-22 Thread Mosè Giordano
Ok, here is a simpler example, not specific to Measuments.jl.

From

@foo f(x, y, z, ...)

get

f([x.re, y.re, z.re, ...].^2 + [x.im, y.im, z.im, ...])

The expression provided to the macro can take any number of 
arguments.  For example:

x = complex(3.2, -5.2)
y = complex(8.9, 0.4)
@foo(log(x, y)) == log([3.2, 8.9].^2 + [-5.2, 0.4])

So my original question boils down to: how to get specific fields of input 
arguments to a macro without evaluating them?
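A sketch of an answer, assuming the arguments are Complex numbers (which do carry `re` and `im` fields): build the field-access expressions from the argument symbols, without ever evaluating them.  `sum` stands in below for any function accepting a vector, since this is only an illustration:

```julia
macro foo(ex)
    f    = esc(ex.args[1])                                  # function name
    args = ex.args[2:end]                                   # argument symbols
    res  = Expr(:vect, [:($(esc(a)).re) for a in args]...)  # :([x.re, y.re, ...])
    ims  = Expr(:vect, [:($(esc(a)).im) for a in args]...)  # :([x.im, y.im, ...])
    return :($f($res .^ 2 + $ims))
end

x = complex(3.2, -5.2)
y = complex(8.9, 0.4)
@foo(sum(x, y))   # expands to sum([3.2, 8.9] .^ 2 + [-5.2, 0.4])
```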

Bye,
Mosè


[julia-users] Re: Help with a macro

2016-06-21 Thread Mosè Giordano
2016-06-21 23:02 GMT+02:00 Kristoffer Carlsson:
>
>
> Your macro should not evaluate anything per se. It should generate 
> expressions that when the code is run will call functions. 
>
> Say a user types @uncertain log(x,y), what do you want the resulting 
> expression after macro expansion has run to be?
>

This should be the expression to be run in the end:

result(log(x.val, y.val),
   (Calculus.gradient(x -> log(x...), [x.val, y.val])),
   (x, y))
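A self-contained sketch of a macro expanding to that shape.  The `M` type, `result`, and `gradient` below are simplified stand-ins invented for this sketch; the package uses its own `result` and `Calculus.gradient`:

```julia
struct M                     # stand-in for Measurement
    val::Float64
end
result(v, grad, args) = (v, grad, args)   # the real `result` does more

# Central finite-difference gradient, a stand-in for Calculus.gradient.
function gradient(f, p; h = 1e-8)
    n = length(p)
    [(f([p[i] + (i == j ? h : 0.0) for i in 1:n]) -
      f([p[i] - (i == j ? h : 0.0) for i in 1:n])) / (2h) for j in 1:n]
end

macro uncertain2(ex)
    f    = esc(ex.args[1])
    vals = Expr(:vect, [:($(esc(a)).val) for a in ex.args[2:end]]...)
    args = Expr(:tuple, map(esc, ex.args[2:end])...)
    return :( result($f($vals...), gradient(x -> $f(x...), $vals), $args) )
end

x, y = M(2.4), M(3.9)
v, g, a = @uncertain2 log(x, y)   # log(2.4, 3.9): log base 2.4 of 3.9
```

Note that the macro only ever manipulates symbols; the `.val` lookups happen in the caller's scope at run time.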

Thank you,
bye,
Mosè


[julia-users] Re: Help with a macro

2016-06-21 Thread Mosè Giordano
Hi Kristoffer,

2016-06-21 22:19 GMT+02:00 Kristoffer Carlsson:
>
> Using @eval in a macro is generally the wrong thing to do. A macro should 
> simply transform an expression to another expression. 


Ok, that's probably what I'm missing, thank you!
 

> If you give an example function body and what you want the macro to 
> transform it to it would be easier to visualize what you want to do.
>

What the macro does is to evaluate the function at the nominal values of 
the Measurement objects provided, calculate the derivative or gradient 
(depending on the arity of the function), and then pass the results to the 
"result" function defined within the package (you don't need to know what 
it does, but if you're curious it's thoroughly commented a few lines above in 
the source code of the package).

The "a" variable in the "then" branch holds the list of arguments, all of 
Measurement type.  I need to get from all of them the nominal value, that 
is the "val" field, in order to evaluate the provided function for those 
values.

The user can pass to the "uncertain" macro any function taking all 
arguments of type "Measurement".  So something like

@uncertain log(x, y)
@uncertain hypot(a, b, c, d)

with a, b, c, d, x, y of type Measurement, should work.

Is it clearer now or do you need more information?

Bye,
Mosè


[julia-users] Help with a macro

2016-06-21 Thread Mosè Giordano
Hi all,

in Measurements.jl , an 
uncertainty propagation library, I added a macro to propagate uncertainty 
for an arbitrary function.  Here you can see some examples of use of the 
macro: 
http://measurementsjl.readthedocs.io/en/latest/examples.html#uncertain-macro

The current implementation is here: 
https://github.com/giordano/Measurements.jl/blob/c28a00a6d9f3078e964d5f3db85321d2a815893f/src/math.jl#L112-L135
  
Currently, the macro can be applied to functions with one or two arguments, 
but I'd like to expand to any number of arguments.  I came up with this 
code:

macro uncertain(expr::Expr)
    f = esc(expr.args[1]) # Function name
    n = length(expr.args) - 1
    if n == 1
        x = esc(expr.args[2]) # Argument, of Measurement type
        return :( result($f($x.val), Calculus.derivative($f, $x.val), $x) )
    else
        a = expr.args[2:end] # Arguments, as an array of expressions
        args = Array(Measurement, n) # Array of evaluated arguments
        argsval = Array(AbstractFloat, n) # Array of nominal values of arguments
        @inbounds for i in 1:n
            args[i] = eval(a[i])
            argsval[i] = args[i].val
        end
        return :( result($f($argsval...),
                         (Calculus.gradient(x -> $f(x...), [$(esc(argsval))...])...),
                         ($args...)) )
    end
end

This works if I execute the macro with two or more arguments (the "else" branch 
of the if):

using Measurements
@uncertain log(2.4±0.1, 3.9±0.2)

but not when the arguments are named variables

using Measurements
x = 2.4±0.1
y = 3.9±0.2
@uncertain log(x, y)

In this case I get the following error:

ERROR: UndefVarError: x not defined

It stops at "eval(a[i])" in the for loop. Instead, everything works fine in the 
case of functions with one argument (the "then" branch of the if inside the 
macro):

@uncertain log(3.9±0.2)
y = 3.9±0.2
@uncertain log(y)

In addition, the macro above works just fine if I define it in the global 
scope; it fails only when I define it inside the module.

Can you please explain me what I'm missing?

Cheers,
Mosè


Re: [julia-users] Re: [ANN & RFC] Measurements.jl: Uncertainty propagation library

2016-06-17 Thread Mosè Giordano
2016-06-16 10:12 GMT+02:00 Mosè Giordano:
> Hi all,
>
> an update: I continued the development of this package, you can have a look
> at the available features here:
> https://github.com/giordano/Measurements.jl#features-list
>
> In addition, I implemented support for correlation between variables, so
> that x - x == zero(x), x/x == one(x), tan(x) == sin(x)/cos(x), and so on.
> This is done in "correlation" branch, but currently performance is less than
> suboptimal.  According to @benchmark, a simple operation like addition
> between two independent (uncorrelated) variables takes ~ 20 µs, against the
> ~15 ns required by the implementation in "master" branch (without support
> for correlation).
>
> You can see how the Measurement type is defined and constructed here:
> https://github.com/giordano/Measurements.jl/blob/1fddcc848c807d3fb96b9b724c1039359597c14d/src/Measurements.jl#L31-L57
> In order to handle correlation between variables, I keep in each Measurement
> object a list of all independent variables from which it's derived in the
> form of a dictionary.  Here you can see how this list is used to compute the
> final uncertainty:
> https://github.com/giordano/Measurements.jl/blob/1fddcc848c807d3fb96b9b724c1039359597c14d/src/math.jl#L26-L95
> The most relevant function is the second one, which is related to
> mathematical operations with two or more operands, so the cases where
> correlation may occur.  This is all you need to understand how all this
> works.
>
> Do you have suggestions on how to improve the performance?  I feel that
> dictionary is a very handy type but it isn't particularly efficient in this
> case, only *creating* a Measurement object takes ~2 ns in "master" branch,
> and ~800 ns in "correlation" branch and Julia 0.4.5 (it's something less in
> Julia 0.5, but always of the order of hundreds of nanoseconds); source:
> @benchmark 3.0 ± 1.0.  I was thinking of creating a new type for handling
> the list of derivatives of independent variables, but how this could be
> defined in order to make it efficient?

Never mind, I sorted this out by myself.  By digging into Julia's
source code I found the Base.ImmutableDict type (by the way: why is
this type not exported nor documented?), that is a lightweight,
immutable dictionary, exactly what I was looking for.  Now, thanks to
the use of this type instead of Dict (and the reorganization of a loop
in order to avoid memory allocations) the addition of two independent
and uncorrelated variables takes just ~500 ns, this is a 40x
improvement with respect to the use of Dict.
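For reference, the access pattern looks like this (`Base.ImmutableDict` is unexported, hence the qualified names):

```julia
# Each extension of a Base.ImmutableDict is a cheap "cons" onto an
# immutable linked list: no hashing, no resizing, and every earlier
# version of the dictionary remains valid and unchanged.
d0 = Base.ImmutableDict{Symbol,Float64}()   # empty dictionary
d1 = Base.ImmutableDict(d0, :x, 2.4)        # extend without mutating d0
d2 = Base.ImmutableDict(d1, :y, 3.9)

d2[:x]            # lookup walks the linked list
haskey(d0, :x)    # false: d0 is untouched
```

Lookup is O(n) in the number of entries, so this pays off only for the small mappings kept in each Measurement object.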

Bye,
Mosè


[julia-users] Re: [ANN & RFC] Measurements.jl: Uncertainty propagation library

2016-06-16 Thread Mosè Giordano
Hi all,

an update: I continued the development of this package, you can have a look 
at the available features here: 
https://github.com/giordano/Measurements.jl#features-list

In addition, I implemented support for correlation between variables 
, so that x - x == 
zero(x), x/x == one(x), tan(x) == sin(x)/cos(x), and so on.  This is done 
in "correlation" branch, but currently performance is far from 
optimal.  According to @benchmark, a simple operation like addition 
between two independent (uncorrelated) variables takes ~ 20 µs, against the 
~15 ns required by the implementation in "master" branch (without support 
for correlation).

You can see how the Measurement type is defined and constructed here: 
https://github.com/giordano/Measurements.jl/blob/1fddcc848c807d3fb96b9b724c1039359597c14d/src/Measurements.jl#L31-L57
  
In order to handle correlation between variables, I keep in each Measurement 
object a list of all independent variables from which it's derived in the 
form of a dictionary.  Here you can see how this list is used to compute 
the final uncertainty: 
https://github.com/giordano/Measurements.jl/blob/1fddcc848c807d3fb96b9b724c1039359597c14d/src/math.jl#L26-L95
  
The most relevant function is the second one, which is related to 
mathematical operations with two or more operands, so the cases where 
correlation may occur.  This is all you need to understand how all this 
works.

Do you have suggestions on how to improve the performance?  I feel that 
Dict is a very handy type but it isn't particularly efficient in this 
case: merely *creating* a Measurement object takes ~2 ns in "master" branch, 
and ~800 ns in "correlation" branch and Julia 0.4.5 (it's something less in 
Julia 0.5, but always of the order of hundreds of nanoseconds); source: 
@benchmark 
3.0 ± 1.0.  I was thinking of creating a new type for handling the list of 
derivatives of independent variables, but how could this be defined in 
order to make it efficient?

Bye,
Mosè


[julia-users] Regression of promote with 6 or more arguments in Julia 0.5

2016-06-01 Thread Mosè Giordano
Hi all,

I'm noticing a large (factor of >100) performance regression of the promote 
function when fed with more than 5 arguments ("julia" is Julia 0.4.5; 
"./julia" is the official current nightly build, version 0.5.0-dev+4438, 
commit aa1ce87):

% julia -e 'using Benchmarks;print(@benchmark promote(1,2,3,4,5))'
 Benchmark Results 
 Time per evaluation: 2.00 ns [1.96 ns, 2.04 ns]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 0.00 bytes
   Number of allocations: 0 allocations
   Number of samples: 4101
   Number of evaluations: 99601
 R² of OLS model: 0.952
 Time spent benchmarking: 1.20 s
% ./julia -e 'using Benchmarks;print(@benchmark promote(1,2,3,4,5))'
 Benchmark Results 
 Time per evaluation: 2.11 ns [2.06 ns, 2.16 ns]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 0.00 bytes
   Number of allocations: 0 allocations
   Number of samples: 3801
   Number of evaluations: 74701
 R² of OLS model: 0.950
 Time spent benchmarking: 2.35 s
% julia -e 'using Benchmarks;print(@benchmark promote(1,2,3,4,5,6))'  
 Benchmark Results 
 Time per evaluation: 2.38 ns [2.34 ns, 2.42 ns]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 0.00 bytes
   Number of allocations: 0 allocations
   Number of samples: 6301
   Number of evaluations: 811601
 R² of OLS model: 0.956
 Time spent benchmarking: 1.75 s
% ./julia -e 'using Benchmarks;print(@benchmark promote(1,2,3,4,5,6))'
 Benchmark Results 
 Time per evaluation: 306.79 ns [300.44 ns, 313.14 ns]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 144.00 bytes
   Number of allocations: 3 allocations
   Number of samples: 4001
   Number of evaluations: 90501
 R² of OLS model: 0.955
 Time spent benchmarking: 2.64 s

I get the same results also with a custom build of the same commit (I 
thought I had a problem with my build and then downloaded the nightly build 
from the site).  Actually, something similar happens also for Julia 0.4 but 
starting from 9 arguments:

% julia -e 'using Benchmarks;print(@benchmark promote(1,2,3,4,5,6,7,8))'  
 Benchmark Results 
 Time per evaluation: 3.71 ns [3.63 ns, 3.79 ns]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 0.00 bytes
   Number of allocations: 0 allocations
   Number of samples: 3801
   Number of evaluations: 74701
 R² of OLS model: 0.955
 Time spent benchmarking: 1.13 s
% julia -e 'using Benchmarks;print(@benchmark promote(1,2,3,4,5,6,7,8,9))'
 Benchmark Results 
 Time per evaluation: 6.27 μs [4.53 μs, 8.02 μs]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 928.00 bytes
   Number of allocations: 29 allocations
   Number of samples: 100
   Number of evaluations: 100
 Time spent benchmarking: 0.18 s
% julia -e 'using Benchmarks;print(@benchmark 
promote(1,2,3,4,5,6,7,8,9,10))'
 Benchmark Results 
 Time per evaluation: 12.85 μs [11.13 μs, 14.56 μs]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 1.11 kb
   Number of allocations: 33 allocations
   Number of samples: 100
   Number of evaluations: 100
 Time spent benchmarking: 0.18 s

Is this normal?

Bye,
Mosè


Re: [julia-users] Re: Is Julia slow with large arrays?

2016-06-01 Thread Mosè Giordano
Oh my bad, this was really easy to fix!  I usually do inspect the code
with @code_warntype but missed to do the same this time.  Lesson
learned.

Now the same loop is about ~30 times faster in Julia than in Fortran,
really impressive.

Thank you all for the prompt comments!

Bye,
Mosè


2016-06-01 13:27 GMT+02:00 Lutfullah Tomak:
> First thing I caught in your code
>
> http://docs.julialang.org/en/release-0.4/manual/performance-tips/#avoid-changing-the-type-of-a-variable
>
> make bar non-type-changing
> function foo()
> array1 = rand(70, 1000)
> array2 = rand(70, 1000)
> array3 = rand(2, 70, 20, 20)
> bar = 0.0
> @time for l = 1:1000, k = 1:20, j = 1:20, i = 1:70
> bar = bar +
> (array1[i, l] - array3[1, i, j, k])^2 +
> (array2[i, l] - array3[2, i, j, k])^2
> end
> end
>
> Second, Julia checks array bounds, so the @inbounds macro before the
> for-loop should help improve performance. In some situations @simd may
> emit vector instructions and thus produce faster code.
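Combining the two hints above (type-stable accumulator plus @inbounds), and taking the arrays as arguments so the function stays testable with any sizes, a sketch might look like:

```julia
# Type-stable rewrite of the hot loop: `bar` starts as 0.0 (Float64) and
# never changes type, and @inbounds removes the bounds checks.
function foo_fast(array1, array2, array3)
    bar = 0.0
    @inbounds for l = 1:size(array1, 2), k = 1:size(array3, 4), j = 1:size(array3, 3), i = 1:size(array1, 1)
        bar += (array1[i, l] - array3[1, i, j, k])^2 +
               (array2[i, l] - array3[2, i, j, k])^2
    end
    return bar
end
```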


[julia-users] Is Julia slow with large arrays?

2016-06-01 Thread Mosè Giordano
Hi all,

I'm working on a Fortran77 code, but I'd much prefer to translate it to 
Julia.  However, I've the impression that when working with large arrays 
(of the order of tens of thousands elements) Julia is way slower than the 
equivalent Fortran code.  See the following examples:

Julia:

function foo()
array1 = rand(70, 1000)
array2 = rand(70, 1000)
array3 = rand(2, 70, 20, 20)
bar = 0
@time for l = 1:1000, k = 1:20, j = 1:20, i = 1:70
bar = bar +
(array1[i, l] - array3[1, i, j, k])^2 +
(array2[i, l] - array3[2, i, j, k])^2
end
end
foo()

Fortran77 (it uses some GNU extensions, so gfortran is required):

  program main
  implicit none
  double precision array1(70, 1000), array2(70, 1000),
 &array3(2, 70, 20, 20), bar, start, finish
  integer i, j, k, l
  call srand(time())
c Initialize "array1" and "array2"
  do j = 1, 1000
do i = 1, 70
  array1(i, j) = rand()
  array2(i, j) = rand()
enddo
  enddo
c Initialize "array3"
  do l = 1, 2
do k = 1, 70
  do j = 1, 20
do i = 1, 20
  array3(i, j, k, l) = rand()
enddo
  enddo
enddo
  enddo
c Do the calculations
  bar = 0
  call cpu_time(start)
  do l = 1, 1000
do k = 1, 20
  do j = 1, 20
do i = 1, 70
  bar = bar +
 &(array1(i, l) - array3(1, i, j, k))**2 +
 &(array2(i, l) - array3(2, i, j, k))**2
enddo
  enddo
enddo
  enddo
  call cpu_time(finish)
  print "(f10.6, a)", finish - start, " seconds"
  end program main

This is the result of running the two programs on my computer:

% julia --version
julia version 0.4.5
% gfortran --version
GNU Fortran (Debian 5.3.1-20) 5.3.1 20160519
Copyright (C) 2015 Free Software Foundation, Inc.

GNU Fortran comes with NO WARRANTY, to the extent permitted by law.
You may redistribute copies of GNU Fortran
under the terms of the GNU General Public License.
For more information about these matters, see the file named COPYING

% julia -f test.jl
  1.099910 seconds (84.00 M allocations: 1.252 GB, 7.14% gc time)
% gfortran test.f&&./a.out
  0.132000 seconds

While the Julia code is 3 times shorter than the Fortran77 code (and this is one 
of the many reasons why I like Julia very much), it's also more than 8 
times slower and my build of Julia 0.5 (updated to aa1ce87) performs even 
worse, it takes about 1.4 seconds.  If I remove access to the arrays (just 
put "bar = bar" as loop body) then Julia is infinitely faster than Fortran 
in the sense that Julia takes 0.00 seconds, Fortran 0.056000.

Is this difference to be expected or are there tricks to speed up the Julia 
code?  I should have gotten the order of nested loops right.  @inbounds 
doesn't help that much.  Parallelization is probably not an option because 
the code above is a small part of a larger loop that does data movement 
(can DistributedArrays help?).

Bye,
Mosè


[julia-users] Re: Quadratic equation solver ?

2016-05-31 Thread Mosè Giordano

>
> which one is a good  solver(open source) for quadratic equation using jump 
> ? Thank you 
>

What is jump?

PolynomialRoots.jl is a 
package to find all roots (real and complex; they will always have Complex 
type) of polynomials of any degree, with a specialized function for 
quadratic equations.  Though, I don't know if this is what you're looking 
for.

Bye,
Mosè


Re: [julia-users] Re: [ANN] PolynomialRoots.jl: Fast Complex Polynomial Root Finder

2016-05-27 Thread Mosè Giordano
Dear all,

I've just noticed that a few weeks ago Jan Skowron clarified the license of 
the original Fortran library: 
http://www.astrouw.edu.pl/~jskowron/cmplx_roots_sg/  It's clear now that 
one can choose to apply either Apache license or LGPL.  The customary 
scientific license has been dropped; now there is a simple invitation to 
cite the paper.

In order to make PolynomialRoots.jl 
<https://github.com/giordano/PolynomialRoots.jl> as widely usable as 
possible in Julia environment, does it make sense to preserve this dual 
license for the package or it's better to choose one of them in order to 
avoid confusion?  My understanding is that both licenses are MIT-compatible 
in the sense that both LGPL and Apache codes can be linked in MIT code.

Cheers,
Mosè



Great – they obviously intended for this to be open source and used, so I'm 
> sure they won't mind clarifying.
>
> On Thu, May 5, 2016 at 7:39 PM, Mosè Giordano  > wrote:
>
>> Hi Steven and Stefan,
>>
>> 2016-05-05 22:29 GMT+02:00 Stefan Karpinski > >:
>> > The phrasing of this licensing is unclear but essentially the same as 
>> the
>> > original Fortran library:
>> >
>> >> The authors release the source codes associated with the Paper under 
>> terms
>> >> of the GNU Lesser General Public License version 2 or any later 
>> version, or
>> >> under the Apache License, Version 2.0 as well as under a "customary
>> >> scientific license", which implies that if this code was important in 
>> the
>> >> scientific process or for the results of your scientific work, we ask 
>> for
>> >> the appropriate citation of the Paper (Skowron & Gould 2012).
>> >
>> >
>> > Their wording seems to indicate dual licensing under LGPL 2 or Apache 2,
>> > which would mean that following the terms of either license gives the 
>> right
>> > to use the software. But then it throws in the "as well as under a
>> > 'customary scientific license'" clause, which completely muddies the 
>> waters.
>> > Does that mean that you may use the software if you follow the terms of 
>> LGPL
>> > 2 AND cite them OR follow the terms of Apache 2 AND cite them? Or that 
>> you
>> > may use the software if you follow the terms of LGPL 2 OR cite them OR
>> > follow the terms of Apache 2 and cite them? Under the former 
>> interpretation,
>> > it would be illegal to use this software as part of a derived work 
>> including
>> > any GPL libraries (which includes Julia in its default configuration) 
>> since
>> > the "customary scientific license" conflicts with the GPL, thereby 
>> making it
>> > impossible to comply with all terms required to be allowed to use the
>> > combined product.
>> >
>> > If the authors intended to require you to follow both the LGPL 2 and 
>> Apache
>> > 2 licenses, the situation may be even worse, since, IIRC, those licenses
>> > themselves conflict, so it would be impossible to satisfy the conditions
>> > required to be allowed to use the software at all.
>> >
>> > It seems like it may be necessary to contact the authors and request 
>> their
>> > clarification of the terms under which one may use their software.
>>
>> I confirm I used the same license as the original library.  Actually,
>> I interpreted the "customary scientific license" as a request, not as
>> a legal obligation (but I'm not a native English speaker nor a
>> lawyer).  The fact that two or more people can't agree on the
>> interpretation shows that probably the license requires clarification,
>> I'll contact the author and ask for it.
>>
>> Bye,
>> Mosè
>>
>
>

Re: [julia-users] Passing a function outputted by a higher order function to C - function not yet c-callable

2016-05-22 Thread Mosè Giordano
Hi Michael,
 

> Thank you for the quick reply. Could you please elaborate on how to pass 
> the data explicitly as you suggest? I don't quite understand.
>

This is a very useful post about closures in general: 
http://julialang.org/blog/2013/05/callback

Bye,
Mosè


Re: [julia-users] Re: [ANN & RFC] Measurements.jl: Uncertainty propagation library

2016-05-19 Thread Mosè Giordano
Hi Andre,

2016-05-19 9:05 GMT+02:00 Andre Bieler:
> As an experimental physicist I thank you very much for this package! :D

You're welcome :-)  If you have wishes or suggestions, please share them!

Bye,
Mosè


[julia-users] Re: [ANN & RFC] Measurements.jl: Uncertainty propagation library

2016-05-17 Thread Mosè Giordano
Hi Lucas,

Looks great!
>
> How about overloading the ± operator?
> using Measurements
>
> ±(μ, σ) = Measurement(μ, σ)
> # => ± (generic function with 1 method)
>
> a = 4.5 ± 0.1
> # 4.5 ± 0.1
>
>
> I don't know if it's defined in other packages, but I've never seen it.
>

 This is a great idea, that's why I love Julia ;-)  Thanks a lot for your 
suggestion, I already implemented it in the latest commit!
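For readers of the archive, the whole trick fits in a couple of lines; the `Meas` type below is a minimal stand-in for the package's full-featured Measurement type:

```julia
struct Meas          # stand-in for Measurements.Measurement
    val::Float64
    err::Float64
end

# ± is already a valid infix operator in Julia, so one definition suffices:
±(μ, σ) = Meas(μ, σ)

a = 4.5 ± 0.1        # same as Meas(4.5, 0.1)
```

This works because Julia's parser treats ± (typed as \pm<TAB>) as a regular binary operator, so giving it a method is no different from defining any other function.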

Bye,
Mosè


[julia-users] [ANN & RFC] Measurements.jl: Uncertainty propagation library

2016-05-17 Thread Mosè Giordano
Dear all,

I'd like to share with you my new simple package Measurements.jl 
, a library for propagation of 
uncertainty.  The package provides a new type, Measurement, that is 
composed by the value and its associated error, assumed to be a standard 
deviation.  Then, some basic mathematical operations are redefined to 
handle Measurement type and return a Measurement object with error computed 
according to rules of uncertainty propagation of standard deviations.
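For a one-argument function the rule being applied is the first-order formula σ_f = |f′(μ)|·σ.  A Base-only sketch of the idea (the finite-difference `deriv` is a stand-in invented for this illustration; the package itself computes derivatives via Calculus.jl):

```julia
# First-order (linear) uncertainty propagation for f(μ ± σ):
# the propagated standard deviation is |f'(μ)| * σ.
deriv(f, x; h = 1e-8) = (f(x + h) - f(x - h)) / (2h)   # central difference
propagate(f, μ, σ) = (f(μ), abs(deriv(f, μ)) * σ)

val, err = propagate(sin, 0.936, 1e-3)   # e.g. sin at 0.936 ± 0.001
```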

I started this new package because I didn't find anything like this in 
http://pkg.julialang.org/  Please tell me if I missed a registered 
package.  Before further developing my package, I'd like to have some 
comments about it and how to improve its design.  The fields of Measurement 
type are arbitrary in order to make the library compatible with as many 
library as possible (I was thinking about compatibility with SIUnits, for 
example), but operations like zero, hypot, and inv must make sense for the 
field types (SIUnits misses hypot), in addition to the other mathematical 
operations.

The package, released under terms of MIT "Expat" License, isn't registered 
yet, but you can check it out with

Pkg.clone("https://github.com/giordano/Measurements.jl")

Here are some examples taken from the README:

using Measurements
a = Measurement(4.5, 0.1)
# => 4.5 ± 0.1
b = Measurement(3.8, 0.4)
# => 3.8 ± 0.4
2a + b
# => 12.8 ± 0.4472135954999579
a - 1.2b
# => -0.05961 ± 0.49030602688525043
l = Measurement(0.936, 1e-3);
T = Measurement(1.942, 4e-3);
P = 4pi^2*l/T^2
# => 9.797993213510699 ± 0.041697817535336676
c = Constant(4)
# => 4 ± 0
a*c
# => 18.0 ± 0.4
sind(Measurement(94, 1.2))
# => 0.9975640502598242 ± 0.0014609761696991563

Cheers,
Mosè


Re: [julia-users] Re: [ANN] PolynomialRoots.jl: Fast Complex Polynomial Root Finder

2016-05-05 Thread Mosè Giordano
Hi Steven and Stefan,

2016-05-05 22:29 GMT+02:00 Stefan Karpinski :
> The phrasing of this licensing is unclear but essentially the same as the
> original Fortran library:
>
>> The authors release the source codes associated with the Paper under terms
>> of the GNU Lesser General Public License version 2 or any later version, or
>> under the Apache License, Version 2.0 as well as under a "customary
>> scientific license", which implies that if this code was important in the
>> scientific process or for the results of your scientific work, we ask for
>> the appropriate citation of the Paper (Skowron & Gould 2012).
>
>
> Their wording seems to indicate dual licensing under LGPL 2 or Apache 2,
> which would mean that following the terms of either license gives the right
> to use the software. But then it throws in the "as well as under a
> 'customary scientific license'" clause, which completely muddies the waters.
> Does that mean that you may use the software if you follow the terms of LGPL
> 2 AND cite them OR follow the terms of Apache 2 AND cite them? Or that you
> may use the software if you follow the terms of LGPL 2 OR cite them OR
> follow the terms of Apache 2 and cite them? Under the former interpretation,
> it would be illegal to use this software as part of a derived work including
> any GPL libraries (which includes Julia in its default configuration) since
> the "customary scientific license" conflicts with the GPL, thereby making it
> impossible to comply with all terms required to be allowed to use the
> combined product.
>
> If the authors intended to require you to follow both the LGPL 2 and Apache
> 2 licenses, the situation may be even worse, since, IIRC, those licenses
> themselves conflict, so it would be impossible to satisfy the conditions
> required to be allowed to use the software at all.
>
> It seems like it may be necessary to contact the authors and request their
> clarification of the terms under which one may use their software.

I confirm I used the same license as the original library.  Actually,
I interpreted the "customary scientific license" as a request, not as
a legal obligation (but I'm not a native English speaker nor a
lawyer).  The fact that two or more people can't agree on the
interpretation shows that probably the license requires clarification,
I'll contact the author and ask for it.

Bye,
Mosè


[julia-users] [ANN] PolynomialRoots.jl: Fast Complex Polynomial Root Finder

2016-05-05 Thread Mosè Giordano
Hi,

a couple of days ago I released version 0.0.2 of PolynomialRoots.jl 
 (formerly known as 
CmplxRoots.jl, some people suggested to change the name to something more 
descriptive), a fast root finder of real and complex polynomials.

This is a Julia implementation of the algorithm described in

   - J. Skowron & A. Gould, 2012, "General Complex Polynomial Root Solver 
   and Its Further Optimization for Binary Microlenses", arXiv:1203.1034 
   

PolynomialRoots.jl is a registered Julia package, so you can install it 
with the package manager:

Pkg.add("PolynomialRoots")

Two functions are exposed to the users:

roots(polynomial[, roots, polish=true])
roots5(polynomial[, roots])

The first function can be used to solve polynomials of any order, the 
second one is specialized and optimized for (and works only for) 
fifth-order polynomials.


This new version of PolynomialRoots.jl features support for arbitrary 
precision calculations.  This is very useful for calculating roots of 
polynomials of very large degree (some hundreds) that couldn't be solved 
using double precision calculations.  Actually, this cures inaccurate 
results also for low order polynomials, like second-order ones, where 
catastrophic cancellation is a problem.  For example, the actual roots of 
the polynomial (see https://en.wikipedia.org/wiki/Loss_of_significance)


94906265.625*x^2 - 189812534*x + 94906268.375 

are

x_1 = 1.00028975958
x_2 = 1.000 

but when you try to calculate them in double-precision you get 

julia> r = roots([94906268.375, -189812534, 94906265.625]); 

julia> r[1]
1.000144879793 - 0.0im 

julia> r[2]
1.000144879788 + 0.0im

If you are interested in double-precision accuracy, you can work around 
this problem by calculating the roots with higher precision and then 
transforming the result to double-precision. What you have to do is only to 
pass BigFloat numbers to roots function:

julia> r = roots([BigFloat(94906268.375), BigFloat(-189812534), BigFloat(
94906265.625)]);

julia> Complex128(r[1])
1.000289759583 - 0.0im

julia> Complex128(r[2])
1.0 + 0.0im

as expected.
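The cancellation itself is easy to reproduce with nothing but Base and the textbook quadratic formula — an illustration of the problem, not the package's algorithm (`quadroots` is a hypothetical helper for this sketch):

```julia
# Textbook quadratic formula.  `complex` guards against the discriminant
# rounding below zero in Float64, which can happen under cancellation.
quadroots(a, b, c) = ((-b - sqrt(complex(b^2 - 4a * c))) / (2a),
                      (-b + sqrt(complex(b^2 - 4a * c))) / (2a))

a, b, c = 94906265.625, -189812534.0, 94906268.375
r64  = quadroots(a, b, c)                 # ruined by cancellation in b^2 - 4ac
rbig = quadroots(big(a), big(b), big(c))  # accurate to BigFloat precision
```

In Float64 the two products b² and 4ac agree in almost all of their digits, so their difference retains almost no significant figures; recomputing in BigFloat restores them.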

This package is licensed under Apache License 2.0 or GNU Lesser General 
Public License version 3 or any later version, as well as under a 
"customary scientific license", which implies that if this code was 
important in the scientific process or for the results of your scientific 
work, you are asked for the appropriate citation of the paper Skowron & 
Gould 2012 (http://arxiv.org/abs/1203.1034).

Cheers,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-30 Thread Mosè Giordano
2016-04-30 14:22 GMT+02:00 Sheehan Olver:
>
> Hi Mose,
>
> I think AMVW is specifically designed for large degree polynomials: it's 
> an O(n^2) algorithm that works via orthogonal transformations on the 
> companion matrix
>
> How do the timings compare on, say, a degree 200 polynomial?
>

Just replace 101 with 201 in my previous script.  For a 200th-degree 
polynomial, PolynomialRoots.jl is still a bit faster; AMVW becomes faster 
only at even larger degrees:

INFO: 400th-order polynomial

PR maximum error:   3.1349840509961133
AMVW maximum error: 8.091848829362318e30

PR benchmark:
 Benchmark Results 
 Time per evaluation: 133.95 ms [125.45 ms, 142.45 ms]
Proportion of time in GC: 0.07% [0.00%, 0.27%]
Memory allocated: 1.20 mb
   Number of allocations: 38837 allocations
   Number of samples: 75
   Number of evaluations: 75
 Time spent benchmarking: 10.27 s

AMVW:
 Benchmark Results 
 Time per evaluation: 52.97 ms [52.36 ms, 53.57 ms]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 33.64 kb
   Number of allocations: 9 allocations
   Number of samples: 100
   Number of evaluations: 100
 Time spent benchmarking: 5.45 s

I wouldn't say that speed is of any utility if one gets such inaccurate 
results.  However, I feel that for large-degree polynomials limited 
precision is bound to always be a problem, either in root finding or in 
polynomial evaluation.

Bye,
Mosè


[julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-30 Thread Mosè Giordano
Hi Sheehan,

2016-04-30 8:12 GMT+02:00 Sheehan Olver:
>
> How does this compare to AMVW.jl?
>
> https://github.com/andreasnoack/AMVW.jl
>

I compared both programs with this script (requires development version of 
PolynomialRoots):

using Benchmarks, PolynomialRoots, AMVW

for polynomial in ([120im, -(184 + 90im), (138 - 57im), (54im - 15), -(6 + 9im), 1.],
                   rand(Complex{Float64}, 101))
    info("$(length(polynomial)-1)th-order polynomial")
    PRroots   = PolynomialRoots.roots(polynomial)
    AMVWroots = AMVW.rootsAMVW(polynomial)
    println("\nPR maximum error:   ",
            maximum(abs(PolynomialRoots.evalpoly(PRroots, polynomial))))
    println("AMVW maximum error: ",
            maximum(abs(PolynomialRoots.evalpoly(AMVWroots, polynomial))))
    println("\nPR benchmark:\n", @benchmark PolynomialRoots.roots(polynomial))
    println("AMVW:\n", @benchmark AMVW.rootsAMVW(polynomial))
end

 This is the result:

INFO: 5th-order polynomial

PR maximum error:   8.526512829121202e-14
AMVW maximum error: 3.0505379675102492e-12

PR benchmark:
 Benchmark Results 
 Time per evaluation: 2.82 μs [2.74 μs, 2.91 μs]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 656.00 bytes
   Number of allocations: 9 allocations
   Number of samples: 2001
   Number of evaluations: 12501
 R² of OLS model: 0.952
 Time spent benchmarking: 0.78 s

AMVW:
 Benchmark Results 
 Time per evaluation: 26.72 μs [24.77 μs, 28.67 μs]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 1.03 kb
   Number of allocations: 9 allocations
   Number of samples: 100
   Number of evaluations: 100
 Time spent benchmarking: 0.12 s

INFO: 100th-order polynomial

PR maximum error:   2.2724270820617676e-7
AMVW maximum error: 1.6100194395356297e-6

PR benchmark:
 Benchmark Results 
 Time per evaluation: 904.96 μs [887.77 μs, 922.15 μs]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 18.45 kb
   Number of allocations: 424 allocations
   Number of samples: 100
   Number of evaluations: 100
 Time spent benchmarking: 0.19 s

AMVW:
 Benchmark Results 
 Time per evaluation: 4.03 ms [3.97 ms, 4.09 ms]
Proportion of time in GC: 0.00% [0.00%, 0.00%]
Memory allocated: 9.41 kb
   Number of allocations: 9 allocations
   Number of samples: 100
   Number of evaluations: 100
 Time spent benchmarking: 0.51 s

So PolynomialRoots.jl is both faster and more precise than AMVW.jl.  Note 
that AMVW.jl employs an algorithm based on finding the eigenvalues of the 
companion matrix, like Polynomials.jl.  Note also that both algorithms often 
fail to converge to the actual roots for large degree polynomials, but the next 
version of PolynomialRoots will feature support for multiple precision 
calculations.  I'll hopefully release it in the next few days.

Bye,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-29 Thread Mosè Giordano
Hi David,

2016-04-30 0:17 GMT+02:00 David P. Sanders :
>
>
> El jueves, 28 de abril de 2016, 16:09:54 (UTC-4), Mosè Giordano escribió:
>>
>> Hi David,
>>
>> 2016-04-28 14:10 GMT+02:00 David P. Sanders :
>> > As far as I could see, the original library has an Apache license, so
>> > you should be able to use MIT.
>> >
>> > I believe that you need to include a copy of the original license in
>> > your package?
>>
>> Original code is dual-licensed, in that case the author of derived
>> work can choose one of the two licenses or both.[1]  I went for LGPL,
>> that is weakly protective
>
>
> What is being protected, your own work?

And of the authors of original library.

>> but perfectly compatible with MIT "Expat"
>
>
> No, it is only half compatible with MIT, i.e. only in one direction: you can
> take code
> from my MIT-licensed package (e.g. ValidatedNumerics) and use it in yours,
> but I *cannot*
> take a piece of code from your package, modify it, and re-use it in mine,
> without "infecting" my package with the LGPL code, which will automatically
> mean that
> I will have to change my license to LGPL.
>
> (This is all only my understanding. Please do correct me if I am wrong.)

No, this isn't correct.  You can take an LPGL library and use it in
your (even proprietary) program without the need to change the license
of your program.  What you can't do is to change the license of the
LGPLed library itself into a more permissive one, but the library
doesn't affect the license of programs making use of it.  This is the
main difference with GPL.  Remember that the first "L" stands for
"Lesser" ;-)  For this reason, an LGPL library can be used in a MIT
program.

>> license, which is the most common in Julia ecosystem, so that most of
>> Julia package can use this package.
>
>
> Again, they can only *use* it.
> I cannot even *look* at your code (which I would very much like to do),
> since if I find something useful, I can't reuse it in my (MIT) package.

You can have the core of your program with MIT license and still
incorporate LGPL code, provided that that part is released under the
terms of LGPL.  Julia itself has some files with separate licenses,
including LGPL:
https://github.com/JuliaLang/julia/blob/master/LICENSE.md  This
doesn't affect the rest of the program.

>> For the time being I don't plan
>> to change the license.
>
>
> I would urge you to reconsider that. You could, for example, license it
> with the same dual license as in the original package -- otherwise, you
> are actually restricting your users more than the original package does
> (for the reasons I discussed above).
>
> The end result otherwise will be that someone
> who needs to use the functionality will end up going back to the original
> Fortran code
> and rewriting what you have already done (which I certainly do not want to
> do),
> or using less-good code from somewhere else.

Ok, I'll consider using the same dual-license as in the original
library (no guarantee), but please note my above comments, they should
alleviate your worries ;-)

Bye,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-29 Thread Mosè Giordano
2016-04-28 22:48 GMT+02:00 Mosè Giordano :
> Hi Steven,
>
> 2016-04-28 14:58 GMT+02:00 Steven G. Johnson :
>> On Thursday, April 28, 2016 at 8:46:06 AM UTC-4, Steven G. Johnson wrote:
>>>
>>> Out of curiosity, how does the performance and accuracy compare to the
>>> "roots" function in the Polynomials.jl package, which uses the standard
>>> companion-matrix approach?
>>
>>
>> Just checked: it is much faster than Polynomials.roots, which I guess is not
>> surprising for a Newton-like method compared to an eigenvalue method,
>
> Yes, as I wrote in my first mail I found my package to be ~20 times
> faster than Polynomials.jl and usually more precise (**very** roughly,
> evaluating abs2(evaluation of polynomial at root) result in ~10^-29
> for Polynomials.jl and ~10^-30 for CmplxRoots.jl).  Probably I
> wouldn't have published the package if it had been both slower and less 
> precise.
>
>> although perhaps the latter is more reliable in some cases?
>>
>> I'm not sure what the state of the art here is, though.
>
> Most of the methods I found are based on finding eigenvalues of
> companion matrix.  A somewhat popular method is Jenkins–Traub
> algorithm (https://en.wikipedia.org/wiki/Jenkins%E2%80%93Traub_algorithm),
> but I didn't find a FLOSS implementation of that algorithm in any
> language, to start with.
>
> In my test suite I added a couple of tests of random polynomials, to
> catch major flaws of root finding algorithm.

I hit a limitation of the algorithm used in PolynomialRoots.jl (the new
name of CmplxRoots.jl, though it hasn't been merged in METADATA.jl
yet): it gives highly inaccurate results for large-order polynomials
(above ~30).  This isn't a systematic behavior, it happens from time
to time, but the larger the degree of the polynomial, the more failures
you'll get.  Fortunately, this can be mitigated by using the new multiple
precision feature.  For the record, Polynomials.jl has the same
problem as PolynomialRoots.jl, with the former still performing worse
than the latter also at high orders.

Regarding interesting algorithms, Aberth–Ehrlich method
(https://en.wikipedia.org/wiki/Aberth_method) looks promising and it
has already been implemented in multiple precision (see
https://en.wikipedia.org/wiki/MPSolve and
http://numpi.dm.unipi.it/mpsolve-2.2/, this isn't FLOSS though).
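For the record, the core of the Aberth–Ehrlich iteration is short enough to 
sketch: all roots are refined simultaneously, each Newton correction being 
damped by the "repulsion" from the other current approximations.  The code 
below is purely illustrative (crude starting points, fixed iteration count, 
nothing like the refined implementation in MPSolve):

```julia
# Horner evaluation of a polynomial with ascending coefficients.
function polyeval(c, z)
    p = c[end]
    for i in length(c)-1:-1:1
        p = p*z + c[i]
    end
    return p
end

# Minimal Aberth-Ehrlich simultaneous iteration (illustrative sketch).
function aberth(c; iters = 100)
    n  = length(c) - 1
    dc = [i*c[i+1] for i in 1:n]                # derivative coefficients
    # Start on a circle whose radius is the Cauchy bound on the roots,
    # slightly rotated off the real axis.
    r = 1 + maximum(abs(c[i]/c[end]) for i in 1:n)
    z = [r*cis(2pi*(k + 0.25)/n) for k in 1:n]
    for _ in 1:iters
        for k in 1:n
            newt  = polyeval(c, z[k]) / polyeval(dc, z[k])  # Newton correction
            repul = sum(1/(z[k] - z[j]) for j in 1:n if j != k)
            z[k] -= newt / (1 - newt*repul)
        end
    end
    return z
end

aberth([-6.0, 11.0, -6.0, 1.0])   # approximates the roots 1, 2, 3
```

The per-sweep cost is O(n^2) and the method keeps the approximations away from 
each other, which is what makes it attractive for large degrees.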

Bye,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-29 Thread Mosè Giordano
2016-04-29 9:58 GMT+02:00 2n :
> So translating header files for bindings is considered a derived work also?
> And also the GNU guys and the porting of unix sofware to linux, that can't
> be considered derived work. I guess it comes down to the definition of the
> word "expressive" like you say, but programming languages are so precise, I
> wouldn't have thought there is much room for expressiveness.

It depends on what you translate.  If it's really needed to
communicate with the library you link to, that may not be considered a
derived work.  If no creativity is involved you may say that there is
no copyright on that part of the code, but I guess this strongly
depends also on local jurisdictions.

LGPL addresses this question for object code, see section 3:

--8<---cut here---start->8---
  3. Object Code Incorporating Material from Library Header Files.

  The object code form of an Application may incorporate material from
a header file that is part of the Library.  You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:

   a) Give prominent notice with each copy of the object code that the
   Library is used in it and that the Library and its use are
   covered by this License.

   b) Accompany the object code with a copy of the GNU GPL and this license
   document.
--8<---cut here---end--->8---

So, if you limit to something really essential, no copyright is claimed on it.

Bye,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-28 Thread Mosè Giordano
2016-04-28 22:34 GMT+02:00 Steven G. Johnson :
>
>
> On Thursday, April 28, 2016 at 4:09:54 PM UTC-4, Mosè Giordano wrote:
>>
>> > +1 for ComplexRoots.jl
>>
>> Ok, then I'll see how to change the name of the package in METADATA.jl
>
>
> I would suggest something like PolynomialRoots.jl; the names "ComplexRoots"
> are "CmplxRoots" are not very descriptive.

Good point, thanks!

Bye,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-28 Thread Mosè Giordano
Hi Steven,

2016-04-28 14:58 GMT+02:00 Steven G. Johnson :
> On Thursday, April 28, 2016 at 8:46:06 AM UTC-4, Steven G. Johnson wrote:
>>
>> Out of curiosity, how does the performance and accuracy compare to the
>> "roots" function in the Polynomials.jl package, which uses the standard
>> companion-matrix approach?
>
>
> Just checked: it is much faster than Polynomials.roots, which I guess is not
> surprising for a Newton-like method compared to an eigenvalue method,

Yes, as I wrote in my first mail I found my package to be ~20 times
faster than Polynomials.jl and usually more precise (**very** roughly,
evaluating abs2(evaluation of polynomial at root) results in ~10^-29
for Polynomials.jl and ~10^-30 for CmplxRoots.jl).  Probably I
wouldn't have published the package if it had been both slower and less precise.

> although perhaps the latter is more reliable in some cases?
>
> I'm not sure what the state of the art here is, though.

Most of the methods I found are based on finding eigenvalues of
companion matrix.  A somewhat popular method is Jenkins–Traub
algorithm (https://en.wikipedia.org/wiki/Jenkins%E2%80%93Traub_algorithm),
but I didn't find a FLOSS implementation of that algorithm in any
language, to start with.

In my test suite I added a couple of tests of random polynomials, to
catch major flaws of root finding algorithm.

> (I find it a little bit worrying that the Skowron and Gould paper justify
> their method on account of it being "1.6 to 3 times" faster than the roots
> function in Numerical Recipes (NR), and that NR was the inspiration for
> their algorithm.  NR routines typically have deeply suboptimal performance,
> and that book is terrible starting point for developing state-of-the-art
> algorithms because NR's approaches are typically antiquated.)

I know that the algorithms shown in Numerical Recipes are often far from
being the state of the art (in addition, their routines aren't FLOSS), but
that series of books is fairly common in astrophysics even today, maybe
because at least the first two authors are well known astrophysicists.

I started from the Skowron & Gould package because it is FLOSS, it's in
Fortran and easy to call from Julia (indeed I first wrote a wrapper
around the Fortran shared object, which took me a few minutes), and I
have used it often, without major problems.

Bye,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-28 Thread Mosè Giordano
Hi David,

2016-04-28 14:10 GMT+02:00 David P. Sanders :
> As far as I could see, the original library has an Apache license, so you 
> should be able to use MIT.
>
> I believe that you need to include a copy of the original license in your 
> package?

Original code is dual-licensed, in that case the author of derived
work can choose one of the two licenses or both.[1]  I went for LGPL,
that is weakly protective but perfectly compatible with MIT "Expat"
license, which is the most common in Julia ecosystem, so that most of
Julia package can use this package.  For the time being I don't plan
to change the license.

> +1 for ComplexRoots.jl

Ok, then I'll see how to change the name of the package in METADATA.jl

In a separate branch of current repository I'm also working on adding
support for arbitrary precision calculations, so the renamed package
will feature this new capability.

Bye,
Mosè


Note
[1] See also http://www.astrouw.edu.pl/~jskowron/cmplx_roots_sg/: «The
authors release the source codes associated with the Paper under terms
of the GNU Lesser General Public License version 2 or any later
version, **or** under the Apache License, Version 2.0»


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-28 Thread Mosè Giordano
Hi,

2016-04-28 12:57 GMT+02:00 DNF :
> Just a question: Is there some particular reason you chose to to skip the
> letters 'o' and 'e' in the name?
>
> It seems like the kind of thing that would make the package hard to find,
> and cause confusion and frustration. In fact, while I immediately saw that
> you skipped the 'e', I read it many times before realizing that the 'o' was
> also gone. Doing Pkg.add() I would have been pretty perplexed, and every
> time >> using Com it would fail.

It stems from the name of the original Fortran code the package is
based on: http://www.astrouw.edu.pl/~jskowron/cmplx_roots_sg/

If also other people will suggest to change the name I can consider
this possibility.

Cheers,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-28 Thread Mosè Giordano
2016-04-27 15:10 GMT+02:00 Mosè Giordano :
> Hi Kristoffer,
>
> 2016-04-27 14:32 GMT+02:00 Kristoffer Carlsson :
>> This looks interesting.
>
> Thank you!
>
>> It is in my opinion a bit unfortunate that you chose
>> to use the GPL license for this since it means that it will not be usable by
>> most packages in Julia which are under MIT license.
>
> Actually I used LGPL, not GPL, because the original Fortran library I
> translated to Julia is released with that license, and, as far as I
> know, you can you an LGPL program into a MIT one.
^^^

Oops, while writing the email I skipped the verb: in place of the
second wrong "you" read "use".  To avoid further confusion, a MIT
program can link to an LGPL one, provided that certain conditions are
met (but no license change is required), see section 4 of LGPLv3 text.

Cheers,
Mosè


Re: [julia-users] Re: [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-27 Thread Mosè Giordano
Hi Kristoffer,

2016-04-27 14:32 GMT+02:00 Kristoffer Carlsson :
> This looks interesting.

Thank you!

> It is in my opinion a bit unfortunate that you chose
> to use the GPL license for this since it means that it will not be usable by
> most packages in Julia which are under MIT license.

Actually I used LGPL, not GPL, because the original Fortran library I
translated to Julia is released with that license, and, as far as I
know, you can you an LGPL program into a MIT one.

Bye,
Mosè


[julia-users] [ANN] CmplxRoots.jl: Fast Complex Polynomial Root Finder

2016-04-27 Thread Mosè Giordano
Dear all,

I am happy to announce the first version of CmplxRoots.jl, a fast root 
finder of real and complex polynomials.

This is a Julia implementation of the algorithm described in 

   - J. Skowron & A. Gould, 2012, "General Complex Polynomial Root Solver 
   and Its Further Optimization for Binary Microlenses", arXiv:1203.1034

I first wrote the package as a wrapper around the Fortran implementation 
available at http://www.astrouw.edu.pl/~jskowron/cmplx_roots_sg/ but then I 
rewrote it entirely in Julia, finding that it's even faster than ccall'ing 
functions in the shared object.

CmplxRoots.jl is a registered Julia package, so you can install it with the 
package manager:

Pkg.add("CmplxRoots")

Two functions are exposed to the users:

roots(polynomial[, roots, polish=true])
roots5(polynomial[, roots])

The first function can be used to solve polynomials of any order; the 
second one is specialized and optimized for (and works only for) 
fifth-order polynomials, but I found it to be a bit slower than roots().

The only mandatory argument is the list of coefficients of the polynomial, 
in ascending order.  So, if you want to find all the roots of polynomial

(x - i)*(x - 2)*(x - 3*i)*(x - 4)*(x - 5*i) =
  x^5 - (9*i + 6)*x^4 + (54*i - 15)*x^3 + (138 - 57*i)*x^2 - (184 + 90*i)*x + 
120*i


you can use

julia> roots([120im, -(184 + 90im), (138 - 57im), (54im - 15), -(6 + 9im), 1])
5-element Array{Complex{Float64},1}:
 -1.55431e-15+5.0im
 4.0+1.77636e-16im
  1.55028e-15+3.0im
 -1.04196e-16+1.0im
 2.0-2.00595e-16im

julia> roots5([120im, -(184 + 90im), (138 - 57im), (54im - 15), -(6 + 9im), 1])
5-element Array{Complex{Float64},1}:
  5.92119e-16+5.0im
  4.0-1.4803e-16im
 2.0+1.88202e-16im
 -1.88738e-15+3.0im
  2.10942e-15+1.0im

The optional roots argument is the array of starting values that can be used to 
find the final roots (it won't be changed in-place).  Note that even if the 
roots are all real, the output array is always of type Complex{Float64}.

The Polynomials.jl package also has a root finding function, but according to 
some random tests performed with @benchmark (from the Benchmarks.jl package) 
it's roughly 20 times slower than the roots() function provided by 
CmplxRoots.jl.  Of course do not expect results with machine precision (you 
can have catastrophic cancellation even for simple quadratic polynomials), 
but some random tests indicate that CmplxRoots.jl is slightly more precise 
than Polynomials.jl (just compute abs2() of the polynomial at the root).
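To make the abs2() check concrete, here is what it gives at the exact roots of 
the fifth-order polynomial above (polyeval is a hypothetical Horner helper 
written for this illustration, not a function exported by CmplxRoots.jl; with 
these small integer-valued coefficients the evaluation happens to be exact even 
in double precision, so the residuals are exactly zero):

```julia
# Horner evaluation of a polynomial with coefficients in ascending order.
function polyeval(c, z)
    p = c[end]
    for i in length(c)-1:-1:1
        p = p*z + c[i]
    end
    return p
end

c = [120im, -(184 + 90im), (138 - 57im), (54im - 15), -(6 + 9im), 1.0]
# The exact roots are i, 2, 3i, 4, 5i: the residuals vanish there.
[abs2(polyeval(c, z)) for z in (1im, 2, 3im, 4, 5im)]  # → [0.0, 0.0, 0.0, 0.0, 0.0]
```

For a computed root the same quantity measures how far it is from being a true 
root, which is the ~10^-29 vs ~10^-30 comparison quoted above.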

The package CmplxRoots.jl is licensed under the GNU Lesser General Public 
License version 3 or any later version, as well as under a "customary 
scientific license", which implies that if this code was important in the 
scientific process or for the results of your scientific work, you are 
asked for the appropriate citation of the paper Skowron & Gould 2012 (
http://arxiv.org/abs/1203.1034).

Happy root-finding,
Mosè


Re: [julia-users] Re: [ANN] Cuba.jl: library for multidimensional numerical integration

2016-04-13 Thread Mosè Giordano
Hi Kaj,

thanks for your question; it served me as inspiration to add another
example to the manual:
https://cubajl.readthedocs.org/en/latest/#passing-data-to-the-integrand-function

Bye,
Mosè


2016-04-13 5:14 GMT+02:00 Steven G. Johnson :

>
>
> On Tuesday, April 12, 2016 at 4:51:07 PM UTC-4, Kaj Wiik wrote:
>>
>> Very nice work indeed!
>>
>> Is it possible to pass 'userdata' to integrand in Julia? It is mentioned
>> in sources but is not listed in keywords.
>>
>
> You don't need an explicit userdata argument -- this is just a crude
> simulation of closures in C, but Julia has true closures and lexical
> scoping.
>
> For example
>
>
> myvar = 7
> Vegas( (x,f) -> f[1] = log(x[1] + myvar)/sqrt(x[1]), 1, 1)
>
>
> will "capture" the value of myvar and pass it through Vegas to the
> integrand, via the closure.
>


Re: [julia-users] Re: [ANN] Cuba.jl: library for multidimensional numerical integration

2016-04-13 Thread Mosè Giordano
Hi Oliver,

2016-04-12 21:38 GMT+02:00 Oliver Schulz :
> Very nice, Mose! I thought I might have to write a Julia wrapper for Cuba at
> some point, but you beat me to it (which I'm grateful for!). :-)

Thanks!

> I noticed that you're maintaining a modified version of Cuba for this - I
> guess because the upstream Cuba doesn't support "make shared"?

Yes, exactly, these are the differences with vanilla Cuba Library:
https://github.com/giordano/cuba/compare/julia  Only configure and
makefile are involved.

> Are you in
> contact with Thomas? He may be willing to add that - if you like, I can ask,
> his office is two floors up from mine. ;-)

I had been in contact with him in the past, and I think I'll contact
him very soon again.

Ciao,
Mosè


Re: [julia-users] Re: [ANN] Cuba.jl: library for multidimensional numerical integration

2016-04-13 Thread Mosè Giordano
Hi Chris,

thanks for your comments :-)  Regarding the integration algorithm, unless you
have specific reasons to use Vegas I warmly recommend using Cuhre, which
in my experience is the best in terms of speed and precision.  There must
be a reason why Cubature.jl, the only widely used numerical integration
package so far, implements the same algorithm ;-)  Only once did I find
Divonne more useful, because I had a wild function with peaks in known
positions and you can tell Divonne to increase sampling around those points.

Bye,
Mosè


2016-04-12 19:08 GMT+02:00 Chris Rackauckas :

> Nice! I was looking for a Vegas function awhile ago and the GSL.jl one
> isn't bound correctly yet. This will be a nice addition to the Julia
> package list. Good job!
>
> On Sunday, April 10, 2016 at 1:34:53 PM UTC-7, Mosè Giordano wrote:
>>
>> Dear all,
>>
>> I am proud to announce Cuba.jl <https://github.com/giordano/Cuba.jl> a
>> library for multidimensional numerical integration with four independent
>> algorithms: Vegas, Suave, Divonne, and Cuhre (this algorithm is the same
>> used in Cubature.jl).  This package is just a wrapper around Cuba Library
>> <http://www.feynarts.de/cuba/>, written in C by Thomas Hahn.
>>
>> Cuba.jl is a registered Julia package, so you can install it with the
>> package manager:
>>
>> Pkg.add("Cuba")
>>
>> The package is usable, but I must admit user interface is not optimal.
>> One has to define a function of this type:
>>
>> function integrand(ndim::Cint, xx::Ptr{Cdouble}, ncomp::Cint, ff::Ptr{Cdouble},
>>    userdata::Ptr{Void})
>> # Take arrays from "xx" and "ff" pointers.
>> x = pointer_to_array(xx, (ndim,))
>> f = pointer_to_array(ff, (ncomp,))
>> # Do calculations on "f" here
>> #   ...
>> # Store back the results to "ff"
>> ff = pointer_from_objref(f)
>> return Cint(0)::Cint
>> end
>>
>> and then call one of the four integrator functions available with this
>> syntax:
>>
>> Vegas(integrand, ndim, ncomp[, keywords...])
>> Suave(integrand, ndim, ncomp[, keywords...])
>> Divonne(integrand, ndim, ncomp[, keywords...])
>> Cuhre(integrand, ndim, ncomp[, keywords...])
>>
>> Issue #3 <https://github.com/giordano/Cuba.jl/issues/3> tracks this
>> problem, if someone wants to help on this is warmly welcome.
>>
>> Documentation of the package is available at
>> https://cubajl.readthedocs.org/ and you can also download the PDF
>> version of the manual from
>> https://media.readthedocs.org/pdf/cubajl/latest/cubajl.pdf  In
>> particular, there is a section with some useful examples:
>> https://cubajl.readthedocs.org/en/latest/#examples
>>
>> Even though Cuba.jl does not support parallelization (see issue #1
>> <https://github.com/giordano/Cuba.jl/issues/1>), its performance is
>> comparable with those of equivalent codes written in C or Fortran relying
>> on Cuba Library: https://github.com/giordano/Cuba.jl#performance
>>
>> Cuba.jl is released under the terms of LGPLv3 and is available for
>> GNU/Linux and OS X (Windows support is currently missing
>> <https://github.com/giordano/Cuba.jl/issues/2>, Cubature.jl is a better
>> alternative for that platform).
>>
>> Feel free to share your comments and suggestions on this package!
>>
>> Cheers,
>> Mosè
>>
>


[julia-users] Re: [ANN] Cuba.jl: library for multidimensional numerical integration

2016-04-12 Thread Mosè Giordano
A quick update: with new version 0.0.5 the syntax of the integrand function 
has been changed and now the integrand function should be passed as in 
Cubature.jl, for the case of vector-valued integrands.  So, for example, to 
integrate log(x)/sqrt(x) for x in [0, 1], one can use one of the following 
commands:

Vegas( (x,f) -> f[1] = log(x[1])/sqrt(x[1]), 1, 1)
Suave( (x,f) -> f[1] = log(x[1])/sqrt(x[1]), 1, 1)
Divonne( (x,f) -> f[1] = log(x[1])/sqrt(x[1]), 1, 1)
Cuhre( (x,f) -> f[1] = log(x[1])/sqrt(x[1]), 1, 1)

More details and examples are available in the manual: 
http://cubajl.readthedocs.org/

Let me thank Steven G. Johnson for his help.

Bye,
Mosè


[julia-users] [ANN] Cuba.jl: library for multidimensional numerical integration

2016-04-10 Thread Mosè Giordano
Dear all,

I am proud to announce Cuba.jl, a 
library for multidimensional numerical integration with four independent 
algorithms: Vegas, Suave, Divonne, and Cuhre (this algorithm is the same 
used in Cubature.jl).  This package is just a wrapper around the Cuba Library, 
written in C by Thomas Hahn.

Cuba.jl is a registered Julia package, so you can install it with the 
package manager:

Pkg.add("Cuba")

The package is usable, but I must admit the user interface is not optimal.  One 
has to define a function of this type:

function integrand(ndim::Cint, xx::Ptr{Cdouble}, ncomp::Cint, ff::Ptr{Cdouble},
                   userdata::Ptr{Void})
# Take arrays from "xx" and "ff" pointers.
x = pointer_to_array(xx, (ndim,))
f = pointer_to_array(ff, (ncomp,))
# Do calculations on "f" here
#   ...
# Store back the results to "ff"
ff = pointer_from_objref(f)
return Cint(0)::Cint
end

and then call one of the four integrator functions available with this 
syntax:

Vegas(integrand, ndim, ncomp[, keywords...])
Suave(integrand, ndim, ncomp[, keywords...])
Divonne(integrand, ndim, ncomp[, keywords...])
Cuhre(integrand, ndim, ncomp[, keywords...])

Issue #3 tracks this problem; anyone who wants to help with this is warmly 
welcome.

Documentation of the package is available at https://cubajl.readthedocs.org/ 
and you can also download the PDF version of the manual from 
https://media.readthedocs.org/pdf/cubajl/latest/cubajl.pdf  In particular, 
there is a section with some useful examples: 
https://cubajl.readthedocs.org/en/latest/#examples

Even though Cuba.jl does not support parallelization (see issue #1 
), its performance is 
comparable with those of equivalent codes written in C or Fortran relying 
on Cuba Library: https://github.com/giordano/Cuba.jl#performance

Cuba.jl is released under the terms of LGPLv3 and is available for 
GNU/Linux and OS X (Windows support is currently missing; Cubature.jl is a 
better alternative for that platform).
alternative for that platform).

Feel free to share your comments and suggestions on this package!

Cheers,
Mosè