[julia-users] Re: Automatic differentiation of mapping from R^N to R^N?

2015-12-01 Thread Jarrett Revels
The restriction that one needs to use generic functions (and as such, can't 
use explicit BLAS calls) isn't clearly stated in ForwardDiff.jl's 
documentation; I just made an issue to track documenting it. Feel free to open 
up an issue on ForwardDiff.jl's repository if you run into further errors 
or unexpected performance bugs.
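
For reference, a minimal sketch of the distinction (the matrix A and the 
function names here are just for illustration):

    using ForwardDiff

    const A = rand(3, 3)

    # Generic: * dispatches to pure-Julia methods when x holds ForwardDiff's
    # number types, so differentiation works.
    f_generic(x) = sum(A * x)

    # Explicit BLAS: gemv is only defined for Float32/Float64 arrays, so
    # ForwardDiff's number types can't pass through it.
    f_blas(x) = sum(Base.LinAlg.BLAS.gemv('N', A, x))

    ForwardDiff.gradient(f_generic, rand(3))   # works
    # ForwardDiff.gradient(f_blas, rand(3))    # fails with a MethodError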

Best,
Jarrett

On Tuesday, December 1, 2015 at 8:07:28 AM UTC-5, Patrick Kofod Mogensen 
wrote:
>
> I know help is much easier to provide with a MWE, but it must be the 
> explicit BLAS calls, I'm sure. I will rewrite the function using Julia 
> functions only. If it doesn't work, I'll have to try to reduce my mapping 
> to a simpler version I can show here. Thanks for taking your time so far.
>
> On Tuesday, December 1, 2015 at 1:23:16 PM UTC+1, Kristoffer Carlsson 
> wrote:
>>
>> Do you have explicit BLAS calls? That won't work. Instead you should just 
>> use the more high-level functions (either * or the Ax_mul_Bx versions) so 
>> that Julia's multiple dispatch can do its thing and run the generic versions 
>> (Julia implementations) of the functions when it is called with "gradient 
>> numbers".
>>
>> Easiest would be if you could post a small runnable example of something 
>> that models what you actually want to do.
>>
>

Re: [julia-users] Re: Hyper-Dual numbers

2015-10-05 Thread Jarrett Revels
Just a note to avoid confusion: ForwardDiff.jl doesn't use complex numbers 
at all. It uses its own implementation of dual number/hyper-dual number 
"ensembles" for computing multiple partial derivatives simultaneously. 
There is no additional error term in ForwardDiff.jl's formulation.
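
For the curious, the dual-number idea in one snippet (a sketch using the 
DualNumbers.jl API, with ε² = 0 so the derivative rides along exactly, with 
no truncation error to balance against round-off):

    using DualNumbers

    f(x) = x^2 + sin(x)
    d = f(Dual(2.0, 1.0))   # evaluate at x = 2 with a unit perturbation
    realpart(d)             # f(2.0)
    epsilon(d)              # f'(2.0) == 2*2.0 + cos(2.0), to machine precision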

-- Jarrett

On Tuesday, September 29, 2015 at 12:21:26 AM UTC-4, Rob J Goedman wrote:
>
> Hi Nitin,
>
> Thanks for the note and references. The recent upgrade of ForwardDiff.jl 
> certainly shows the value of these methods.
>
> Did you (re-)implement MultiComplexNumbers from the 2nd reference, or use 
> one of the two (Fortran and Matlab) prototype modules? Did you use it from 
> within Julia? Do you know the license that comes with the implementation 
> you used? 
>
> Sorry for all the questions!
>
> Regards,
> Rob
>
>
> On Sep 28, 2015, at 18:11, Nitin Arora wrote:
>
> There is another method to calculate any order derivatives, very similar 
> to this one, known as Multi-complex differentiation. This method can 
> calculate derivatives up to any order.
>
>
> http://www.autodiff.org/Docs/euroad/13rd%20EuroAd%20Workshop%20-%20Thierry%20Dargent%20-%20Using%20Multicomplex%20Variables%20for%20Automatic%20Computation%20of%20High-Order%20Derivatives.pdf
>
>
> http://delivery.acm.org/10.1145/217/2168774/a16-lantoine.pdf?ip=207.151.221.1&id=2168774&acc=ACTIVE%20SERVICE&key=6405B83BDA580DC2%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35&CFID=717380051&CFTOKEN=25446753&__acm__=1443489222_ee3f3d4c7ee41030aaba536e14d6a6ab
>
> I have used this in my field and it seems to work nicely.
>
> thanks,
> Nitin
>
> On Saturday, March 29, 2014 at 5:43:05 PM UTC-7, Rob J Goedman wrote:
>>
>> Hi,
>>
>> As a first 'jump into the fray' exercise I've attempted to translate 
>> Jeffrey Fike's hyper-dual numbers code from C++ to Julia, more or less 
>> following the DualNumbers package.
>>
>> The C++ code can be found at 
>> http://adl.stanford.edu/hyperdual/hyperdual.h . The paper itself at 
>> http://adl.stanford.edu/hyperdual/Fike_AIAA-2011-886.pdf .
>>
>> The Julia package can be found at: 
>> https://github.com/goedman/HyperDualNumbers.jl.git .
>>
>> Of course, I'm pretty new at this so I'm sure there will be errors and 
>> poor practices. So any feedback is appreciated.
>>
>> Also, I'm wondering if the type should be called Hyper or a better name 
>> would be HyperDual.
>>
>> This work was triggered by the interesting threads around openPP, 
>> TaylorSeries.jl, Calculus2, PowerSeries.jl (and at some time I hope 
>> MCMC.jl).
>>
>> Rob J. Goedman
>> goe...@icloud.com
>>
>>
>>
>>
>>

Re: [julia-users] META: What are the chances of moving this forum?

2015-09-10 Thread Jarrett Revels

>
> -1 to re-purposing GitHub issues for any sort of free-form discussion. I 
> don't see that as solving any problems. At scale, the GitHub issue tracker 
> is limited even for the intended purpose.
>

The main reason I mentioned it was that it was a solid example of the kind 
of features I had always felt Google Groups lacked. But...

I do like Discourse because of the auto-suggestion, tiered moderation, 
> proper threading, categorization, and other features that would (hopefully) 
> *cut down* on the number of pings we see. Discourse also has at least 
> one-way GitHub cross-linking (mentioning a GitHub issue in discourse will 
> expand to a short summary, IIRC).
>

...that basically sums up everything I'd want in a new mailing list. I just 
gave their sandbox a whirl, and it looks 
awesome, especially the onebox feature (that would be *really* great for 
seeing where commits/PRs lead without having to click on them). Discourse 
has my vote, if the idea of moving the mailing list ever takes flight.

On Thursday, September 10, 2015 at 1:37:01 PM UTC-4, Miguel Bazdresch wrote:
>
> Personally, I'd prefer to just have an old-fashioned, LISTSERV-type mailing 
> list. Yes, I'm old.
>
> -- mb
>
> On Wed, Sep 9, 2015 at 7:34 AM, Nils Gudat wrote:
>
>> Was just thinking about this as first I had to try opening a thread here 
>> a couple of times before any posts were actually displayed, and then read 
>> through this 
>> thread on Mike's Juno Discuss forum, where a user had a question which was 
>> almost answered by the nice auto-suggestion feature that pops up when you 
>> ask a question.
>>
>> I feel that Google Groups has mostly downsides - the system is slow, with 
>> posts frequently not loading, double posts because of this, no proper code 
>> formatting or Markdown support etc.
>>
>> Is there a chance this could be moved to something like Discuss or is 
>> there too much inertia in having an "established" forum on Google Groups?
>>
>
>

Re: [julia-users] Re: META: What are the chances of moving this forum?

2015-09-10 Thread Jarrett Revels

>
> I think you may have misunderstood my comment: I suggested that one of 
> the larger Julia projects, e.g. Julia-Stats, Juno, or Gadfly, would try 
> this out and report back.
>

Oh yeah, sorry, that makes more sense haha.

On Thursday, September 10, 2015 at 9:15:53 AM UTC-4, Mauro wrote:
>
> >> Maybe it would be best if this was 
> >> tried out with a large Julia project to see the ins and outs? 
> >> Volunteers? 
> > 
> > I imagine that the only way that a tentative switch to a "JuliaUsers" 
> repo 
> > on Github would gain any significant traction is if the repo was under 
> the 
> > JuliaLang org. 
>
> I think you may have misunderstood my comment: I suggested that one of 
> the larger Julia projects, e.g. Julia-Stats, Juno, or Gadfly, would try 
> this out and report back. 
>
> But yes, the JuliaUsers would have to be part of JuliaLang. 
>
> > On Thursday, September 10, 2015 at 5:15:12 AM UTC-4, Mauro wrote: 
> >> 
> >> >> Do any of these other suggestions have a mailing list interface? 
> >> > 
> >> > GitHub's issue tracking interface does. 
> >> 
> >> In my email client the threading doesn't seem to work well with github 
> >> issue-emails, so I end up using the web interface most of the time. 
> >> Otherwise it might work well.  I had a quick look around on the 
> >> internet, not many projects seem to use issues as mailing list.  I only 
> >> found these: 
> >> https://lists.w3.org/Archives/Public/public-houdini/2015Sep/0004.html 
> >> https://github.com/hapijs/discuss/issues 
> >> 
> >> @-notifications could be quite nice.  Maybe it would be best if this 
> was 
> >> tried out with a large Julia project to see the ins and outs? 
> >> Volunteers? 
> >> 
> >> > On Wednesday, September 9, 2015 at 3:59:51 PM UTC-4, Robert DJ wrote: 
> >> >> 
> >> >> I'm not a fan of Google, so I think it would be great to find another 
> >> forum. 
> >> >> Besides, I fully agree with the technical issues mentioned. 
> >> >> 
> >> >> Regarding the need for user accounts on e.g. GitHub, I don't see how 
> >> that 
> >> >> is different from requiring a Google account for participating 
> here... 
> >> >> 
> >> >> On Wednesday, September 9, 2015 at 5:33:33 PM UTC+2, Jarrett Revels 
> >> wrote: 
> >> >>> 
> >> >>> I often find myself wishing that this forum was just a GitHub repo, 
> >> with 
> >> >>> posts as issues. It would be great for supporting @-style mentions, 
> >> >>> markdown, issue-linking, etc. The main downside would be forcing 
> >> everybody 
> >> >>> to have GitHub account (not bad IMO, but I can see others having an 
> >> issue 
> >> >>> for it). 
> >> >>> 
> >> >>> It would probably be a bad idea to upheave the forums, anyway, 
> >> especially 
> >> >>> at such a critical moment in Julia's development (v0.4 getting 
> >> released 
> >> >>> soon). I doubt this is going to happen. 
> >> >>> 
> >> >>> On Wednesday, September 9, 2015 at 7:34:58 AM UTC-4, Nils Gudat 
> wrote: 
> >> >>>> 
> >> >>>> Was just thinking about this as first I had to try opening a 
> thread 
> >> here 
> >> >>>> a couple of times before any posts were actually displayed, and 
> then 
> >> read 
> >> >>>> through this 
> >> >>>> <http://discuss.junolab.org/t/how-to-let-julia-be-the-boss-of-juno-windows-windows-issue/320> 
> >> >>>> thread on Mike's Juno Discuss forum, where a user had a question 
> >> which was 
> >> >>>> almost answered by the nice auto-suggestion feature that pops up 
> when 
> >> you 
> >> >>>> ask a question. 
> >> >>>> 
> >> >>>> I feel that Google Groups has mostly downsides - the system is 
> slow, 
> >> >>>> with posts frequently not loading, double posts because of this, 
> no 
> >> proper 
> >> >>>> code formatting or Markdown support etc. 
> >> >>>> 
> >> >>>> Is there a chance this could be moved to something like Discuss or 
> is 
> >> >>>> there too much inertia in having an "established" forum on Google 
> >> Groups? 
> >> >>>> 
> >> >>> 
> >> 
> >> 
>
>

Re: [julia-users] Re: META: What are the chances of moving this forum?

2015-09-10 Thread Jarrett Revels

>
> Maybe it would be best if this was
> tried out with a large Julia project to see the ins and outs?
> Volunteers?


I imagine that the only way that a tentative switch to a "JuliaUsers" repo 
on Github would gain any significant traction is if the repo was under the 
JuliaLang org.

On Thursday, September 10, 2015 at 5:15:12 AM UTC-4, Mauro wrote:
>
> >> Do any of these other suggestions have a mailing list interface? 
> > 
> > GitHub's issue tracking interface does. 
>
> In my email client the threading doesn't seem to work well with github 
> issue-emails, so I end up using the web interface most of the time. 
> Otherwise it might work well.  I had a quick look around on the 
> internet, not many projects seem to use issues as mailing list.  I only 
> found these: 
> https://lists.w3.org/Archives/Public/public-houdini/2015Sep/0004.html 
> https://github.com/hapijs/discuss/issues 
>
> @-notifications could be quite nice.  Maybe it would be best if this was 
> tried out with a large Julia project to see the ins and outs? 
> Volunteers? 
>
> > On Wednesday, September 9, 2015 at 3:59:51 PM UTC-4, Robert DJ wrote: 
> >> 
> >> I'm not a fan of Google, so I think it would be great to find another 
> forum. 
> >> Besides, I fully agree with the technical issues mentioned. 
> >> 
> >> Regarding the need for user accounts on e.g. GitHub, I don't see how 
> that 
> >> is different from requiring a Google account for participating here... 
> >> 
> >> On Wednesday, September 9, 2015 at 5:33:33 PM UTC+2, Jarrett Revels 
> wrote: 
> >>> 
> >>> I often find myself wishing that this forum was just a GitHub repo, 
> with 
> >>> posts as issues. It would be great for supporting @-style mentions, 
> >>> markdown, issue-linking, etc. The main downside would be forcing 
> everybody 
> >>> to have GitHub account (not bad IMO, but I can see others having an 
> issue 
> >>> for it). 
> >>> 
> >>> It would probably be a bad idea to upheave the forums, anyway, 
> especially 
> >>> at such a critical moment in Julia's development (v0.4 getting 
> released 
> >>> soon). I doubt this is going to happen. 
> >>> 
> >>> On Wednesday, September 9, 2015 at 7:34:58 AM UTC-4, Nils Gudat wrote: 
> >>>> 
> >>>> Was just thinking about this as first I had to try opening a thread 
> here 
> >>>> a couple of times before any posts were actually displayed, and then 
> read 
> >>>> through this 
> >>>> <http://discuss.junolab.org/t/how-to-let-julia-be-the-boss-of-juno-windows-windows-issue/320> 
> >>>> thread on Mike's Juno Discuss forum, where a user had a question 
> which was 
> >>>> almost answered by the nice auto-suggestion feature that pops up when 
> you 
> >>>> ask a question. 
> >>>> 
> >>>> I feel that Google Groups has mostly downsides - the system is slow, 
> >>>> with posts frequently not loading, double posts because of this, no 
> proper 
> >>>> code formatting or Markdown support etc. 
> >>>> 
> >>>> Is there a chance this could be moved to something like Discuss or is 
> >>>> there too much inertia in having an "established" forum on Google 
> Groups? 
> >>>> 
> >>> 
>
>

[julia-users] Re: META: What are the chances of moving this forum?

2015-09-09 Thread Jarrett Revels

>
> Do any of these other suggestions have a mailing list interface?


GitHub's issue tracking interface does.

On Wednesday, September 9, 2015 at 3:59:51 PM UTC-4, Robert DJ wrote:
>
> I'm not a fan of Google, so I think it would be great to find another forum. 
> Besides, I fully agree with the technical issues mentioned.
>
> Regarding the need for user accounts on e.g. GitHub, I don't see how that 
> is different from requiring a Google account for participating here...
>
> On Wednesday, September 9, 2015 at 5:33:33 PM UTC+2, Jarrett Revels wrote:
>>
>> I often find myself wishing that this forum was just a GitHub repo, with 
>> posts as issues. It would be great for supporting @-style mentions, 
>> markdown, issue-linking, etc. The main downside would be forcing everybody 
>> to have GitHub account (not bad IMO, but I can see others having an issue 
>> for it).
>>
>> It would probably be a bad idea to upheave the forums, anyway, especially 
>> at such a critical moment in Julia's development (v0.4 getting released 
>> soon). I doubt this is going to happen.
>>
>> On Wednesday, September 9, 2015 at 7:34:58 AM UTC-4, Nils Gudat wrote:
>>>
>>> Was just thinking about this as first I had to try opening a thread here 
>>> a couple of times before any posts were actually displayed, and then read 
>>> through this 
>>> <http://discuss.junolab.org/t/how-to-let-julia-be-the-boss-of-juno-windows-windows-issue/320>
>>>  
>>> thread on Mike's Juno Discuss forum, where a user had a question which was 
>>> almost answered by the nice auto-suggestion feature that pops up when you 
>>> ask a question.
>>>
>>> I feel that Google Groups has mostly downsides - the system is slow, 
>>> with posts frequently not loading, double posts because of this, no proper 
>>> code formatting or Markdown support etc.
>>>
>>> Is there a chance this could be moved to something like Discuss or is 
>>> there too much inertia in having an "established" forum on Google Groups?
>>>
>>

[julia-users] Re: META: What are the chances of moving this forum?

2015-09-09 Thread Jarrett Revels
I often find myself wishing that this forum was just a GitHub repo, with 
posts as issues. It would be great for supporting @-style mentions, 
markdown, issue-linking, etc. The main downside would be forcing everybody 
to have GitHub account (not bad IMO, but I can see others having an issue 
for it).

It would probably be a bad idea to upheave the forums, anyway, especially 
at such a critical moment in Julia's development (v0.4 getting released 
soon). I doubt this is going to happen.

On Wednesday, September 9, 2015 at 7:34:58 AM UTC-4, Nils Gudat wrote:
>
> Was just thinking about this as first I had to try opening a thread here a 
> couple of times before any posts were actually displayed, and then read 
> through this 
> thread on Mike's Juno Discuss forum, where a user had a question which was 
> almost answered by the nice auto-suggestion feature that pops up when you 
> ask a question.
>
> I feel that Google Groups has mostly downsides - the system is slow, with 
> posts frequently not loading, double posts because of this, no proper code 
> formatting or Markdown support etc.
>
> Is there a chance this could be moved to something like Discuss or is 
> there too much inertia in having an "established" forum on Google Groups?
>


[julia-users] Re: [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-08 Thread Jarrett Revels

>
> For the latter, you would need to be able to take linear combinations of 
> epsilons. Is that currently possible?
>

If I correctly understand what you're saying, then yes. See the Types 
section of the notebook I previously linked (the 
types described will be documented in more detail soon).

On Sunday, September 6, 2015 at 7:14:09 PM UTC-4, Eric Forgy wrote:
>
> I like this. I think AD can be extended in a fairly straightforward manner 
> to stochastic differentials, e.g. Ito formula. Has anybody looked into 
> this? That could be interesting for finance applications.
>
> This could also be interesting for use in other differential algebras. In 
> particular, extending it to higher degree differential forms could be nice.
>
> For the latter, you would need to be able to take linear combinations of 
> epsilons. Is that currently possible?
>
> For example,
>
> e = e1 + e2
>
> e^2 = e1*e2 + e2*e1 = 0
>
> => 
>
> e1*e2 = -e2*e1
>
> That would be cool.
>
>

[julia-users] Re: [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-05 Thread Jarrett Revels

>
>  I assume it works with multi-dimensional functions f(x,y,z)?
>

Yes, in a sense. ForwardDiff only accepts univariate functions, but the 
single argument to those functions can be a Number or Vector. So, if you 
wanted the gradient of the f you gave, and x, y, z are scalars, you 
could rewrite f to accept a Vector and do:

julia> ForwardDiff.gradient(f, [x, y, z])
 
Otherwise, if you wanted to do something like differentiate w.r.t. 
different arguments, you could simply use closures (see my comment in this 
issue).
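
A quick sketch of both patterns (function names here are just for 
illustration):

    using ForwardDiff

    # f: R^3 -> R, rewritten to take a single Vector argument
    f(v) = v[1]*v[2] + sin(v[3])
    ForwardDiff.gradient(f, [1.0, 2.0, 3.0])    # gradient w.r.t. all inputs

    # differentiate w.r.t. x alone by closing over the other arguments
    g(x, y, z) = x*y + sin(z)
    ForwardDiff.derivative(x -> g(x, 2.0, 3.0), 1.0)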

What are the limitations to it?
>

ForwardDiff supports up to a third-order generalization of the Hessian, but 
you can nest the results to go to higher orders. The usage example given in 
the README demonstrates 
nested derivatives in ForwardDiff (in that specific example, we take the 
Jacobian of the gradient and then check its equivalence with the Hessian).
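
For example (a sketch reusing the vector-input f from the snippet above; 
the function-returning API matches the README example mentioned here):

    v  = [1.0, 2.0, 3.0]
    g  = ForwardDiff.gradient(f)                 # v -> ∇f(v)
    jg = ForwardDiff.jacobian(g)                 # v -> Jacobian of ∇f at v
    isapprox(jg(v), ForwardDiff.hessian(f, v))   # true, up to rounding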

ForwardDiff also only supports differentiation of functions where the 
involved number types are subtypes of Real. I think generalization to 
Complex numbers is possible, but we haven't played with it yet.

Where would you still use analytic derivatives?
>

If you mean hard-coding the known derivative of a function, I imagine that, 
in most cases, you'll be able to write a faster function by hand than 
ForwardDiff can generate. But analytic higher-order expressions can 
sometimes be tough and time consuming to code efficiently. I would say that 
ForwardDiff is worth trying out in any situation that normally calls for 
numerical differentiation.

Best,
Jarrett

On Saturday, September 5, 2015 at 1:16:36 PM UTC-4, Michael Prentiss wrote:
>
> This looks very impressive.  I assume it works with multi-dimensional 
> functions f(x,y,z)?
>
> It also looks very fast.  What are the limitations to it?  Where would you 
> still use analytic derivatives?
>


[julia-users] Re: [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-05 Thread Jarrett Revels
Thanks! I'm going to be working this upcoming week on adding developer docs 
to our current documentation to 
explain some of the behind-the-scenes details; I'll post here once that's 
completed. In the meantime, the notebook I posted earlier gives a 
surface-level explanation of the way the package works (useful info starts 
at the "Types" section). That notebook was for an AD conference, so it 
presupposes knowledge of dual numbers and hyper-dual numbers; a good intro 
paper for the subject can be found here.

On Saturday, September 5, 2015 at 5:21:19 AM UTC-4, Michael Francis wrote:
>
> This is very impressive. Is there a description of the Julia mechanics at 
> work here? The implementation looks very clean. 



[julia-users] Re: [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-04 Thread Jarrett Revels
D'oh! Forgot to include the line:

   julia> x = rand(1);

On Friday, September 4, 2015 at 11:14:22 AM UTC-4, Jarrett Revels wrote:
>
> Are there any benchmark results for the "more performant and accurate" bit?
>>
>
> Very valid question! There are some benchmarks here, near the bottom of 
> this file 
> <https://github.com/mlubin/EuroAD2015/blob/master/forwarddiff.ipynb>. The 
> code used to run those benchmarks can be found here 
> <https://github.com/JuliaDiff/ForwardDiff.jl/tree/master/benchmarks>. 
> Those benchmarks are actually old, though - the current release should 
> actually be quite a bit faster than the version we ran those benchmarks 
> against (they were actually taken right before we fixed a major performance 
> regression - updating them is on my to-do list). 
>
> You may notice that we only really compare against other automatic 
> differentiation (AD) tools. That's because AD encompasses a whole different 
class of techniques than what's referred to as "symbolic" or "numeric" 
> differentiation; results of AD algorithms are generally within machine 
> epsilon precision of the "real" answers, and these algorithms have provably 
> faster runtime complexities (e.g. potentially O(1) evaluations of the 
> target function using AD vs O(n) evaluations of the target function using 
> finite differencing). Thus, the benchmarks don't compare against non-AD 
> methods as it's not a very useful comparison for how fast the package is; 
> the non-AD methods are generally provably slower. 
>
> That being said, we should probably start benchmarking against traditional 
> methods just to track performance regressions, and to win over people who 
> haven't yet encountered AD. In that spirit, here's a benchmark comparing 
> Calculus's gradient method with ForwardDiff's:
>
> julia> function ackley(x)
>a, b, c = 20.0, -0.2, 2.0*π
>len_recip = inv(length(x))
>sum_sqrs = zero(eltype(x))
>sum_cos = sum_sqrs
>for i in x
>sum_cos += cos(c*i)
>sum_sqrs += i^2
>end
>return (-a * exp(b * sqrt(len_recip*sum_sqrs)) -
>exp(len_recip*sum_cos) + a + e)
>end
> ackley (generic function with 1 method)
>
> julia> calc_g = Calculus.gradient(ackley);
>
> julia> @time calc_g(x);
>   4.379173 seconds (69.50 k allocations: 1.137 MB)
>
> julia> fd_g = ForwardDiff.gradient(ackley, chunk_size=10);
>
> julia> @time fd_g(x);
>   0.523414 seconds (2.02 k allocations: 266.266 KB)
>
> Best,
> Jarrett
>
> On Friday, September 4, 2015 at 5:27:27 AM UTC-4, Johan Sigfrids wrote:
>>
>> Are there any benchmark results for the "more performant and accurate" 
>> bit?
>>
>> On Thursday, September 3, 2015 at 11:25:01 PM UTC+3, Jarrett Revels wrote:
>>>
>>> I'm proud to announce that we've tagged and released a new version 
>>> ForwardDiff.jl (https://github.com/JuliaDiff/ForwardDiff.jl).
>>>
>>> ForwardDiff.jl is a package for performing automatic differentiation 
>>> <https://en.wikipedia.org/wiki/Automatic_differentiation> on native 
>>> Julia functions/callable objects. The techniques used by this package *are 
>>> more performant and accurate than other standard algorithms for 
>>> differentiation*, so if taking derivatives is something you're at all 
>>> interested in, I suggest you give ForwardDiff.jl a try!
>>>
>>> If you don't already have the package, you can install it with Julia's 
>>> package manager by running the following:
>>>
>>> julia> Pkg.update(); Pkg.add("ForwardDiff")
>>>
>>> If you already have the old version of ForwardDiff.jl, you can update it 
>>> to the new one by simply running Pkg.update().
>>>
>>> Note that *the new version of ForwardDiff.jl only supports Julia v0.4. 
>>> *Julia 
>>> v0.3 users will have to stick to the old version of the package. Also note 
>>> that *the new version introduces some breaking changes*, so you'll 
>>> probably have to rewrite any old ForwardDiff.jl code you have (I promise 
>>> it'll be worth it).
>>>
>>> I've spent a good chunk of the summer overhauling it as part of my Julia 
>>> Summer of Code project, so I hope other folks will find it useful. As 
>>> always, opening issues and pull requests in ForwardDiff.jl's GitHub repo is 
>>> very welcome.
>>>
>>> I'd like to thank Julia Computing, NumFocus, and The Betty and Gordon 
>>> Moore Foundation for putting JSoC together. And, of course, I thank Miles 
>>> Lubin, Theo Papamarkou, and a host of other Julians for their invaluable 
>>> guidance and mentorship throughout the project.
>>>
>>> Best,
>>> Jarrett
>>>
>>>
>>>

[julia-users] Re: [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-04 Thread Jarrett Revels

>
> Are there any benchmark results for the "more performant and accurate" bit?
>

Very valid question! There are some benchmarks here, near the bottom of 
this file 
<https://github.com/mlubin/EuroAD2015/blob/master/forwarddiff.ipynb>. The 
code used to run those benchmarks can be found here 
<https://github.com/JuliaDiff/ForwardDiff.jl/tree/master/benchmarks>. Those 
benchmarks are actually old, though - the current release should actually 
be quite a bit faster than the version we ran those benchmarks against 
(they were actually taken right before we fixed a major performance 
regression - updating them is on my to-do list). 

You may notice that we only really compare against other automatic 
differentiation (AD) tools. That's because AD encompasses a whole different 
class of techniques than what's referred to as "symbolic" or "numeric" 
differentiation; results of AD algorithms are generally within machine 
epsilon precision of the "real" answers, and these algorithms have provably 
faster runtime complexities (e.g. potentially O(1) evaluations of the 
target function using AD vs O(n) evaluations of the target function using 
finite differencing). Thus, the benchmarks don't compare against non-AD 
methods as it's not a very useful comparison for how fast the package is; 
the non-AD methods are generally provably slower. 

That being said, we should probably start benchmarking against traditional 
methods just to track performance regressions, and to win over people who 
haven't yet encountered AD. In that spirit, here's a benchmark comparing 
Calculus's gradient method with ForwardDiff's:

julia> function ackley(x)
   a, b, c = 20.0, -0.2, 2.0*π
   len_recip = inv(length(x))
   sum_sqrs = zero(eltype(x))
   sum_cos = sum_sqrs
   for i in x
   sum_cos += cos(c*i)
   sum_sqrs += i^2
   end
   return (-a * exp(b * sqrt(len_recip*sum_sqrs)) -
   exp(len_recip*sum_cos) + a + e)
   end
ackley (generic function with 1 method)

julia> calc_g = Calculus.gradient(ackley);

julia> @time calc_g(x);
  4.379173 seconds (69.50 k allocations: 1.137 MB)

julia> fd_g = ForwardDiff.gradient(ackley, chunk_size=10);

julia> @time fd_g(x);
  0.523414 seconds (2.02 k allocations: 266.266 KB)

Best,
Jarrett

On Friday, September 4, 2015 at 5:27:27 AM UTC-4, Johan Sigfrids wrote:
>
> Are there any benchmark results for the "more performant and accurate" bit?
>
> On Thursday, September 3, 2015 at 11:25:01 PM UTC+3, Jarrett Revels wrote:
>>
>> I'm proud to announce that we've tagged and released a new version 
>> ForwardDiff.jl (https://github.com/JuliaDiff/ForwardDiff.jl).
>>
>> ForwardDiff.jl is a package for performing automatic differentiation 
>> <https://en.wikipedia.org/wiki/Automatic_differentiation> on native 
>> Julia functions/callable objects. The techniques used by this package *are 
>> more performant and accurate than other standard algorithms for 
>> differentiation*, so if taking derivatives is something you're at all 
>> interested in, I suggest you give ForwardDiff.jl a try!
>>
>> If you don't already have the package, you can install it with Julia's 
>> package manager by running the following:
>>
>> julia> Pkg.update(); Pkg.add("ForwardDiff")
>>
>> If you already have the old version of ForwardDiff.jl, you can update it 
>> to the new one by simply running Pkg.update().
>>
>> Note that *the new version of ForwardDiff.jl only supports Julia v0.4. 
>> *Julia 
>> v0.3 users will have to stick to the old version of the package. Also note 
>> that *the new version introduces some breaking changes*, so you'll 
>> probably have to rewrite any old ForwardDiff.jl code you have (I promise 
>> it'll be worth it).
>>
>> I've spent a good chunk of the summer overhauling it as part of my Julia 
>> Summer of Code project, so I hope other folks will find it useful. As 
>> always, opening issues and pull requests in ForwardDiff.jl's GitHub repo is 
>> very welcome.
>>
>> I'd like to thank Julia Computing, NumFocus, and The Betty and Gordon 
>> Moore Foundation for putting JSoC together. And, of course, I thank Miles 
>> Lubin, Theo Papamarkou, and a host of other Julians for their invaluable 
>> guidance and mentorship throughout the project.
>>
>> Best,
>> Jarrett
>>
>>
>>

[julia-users] [ANN] ForwardDiff.jl v0.1.0 Released

2015-09-03 Thread Jarrett Revels
I'm proud to announce that we've tagged and released a new version 
ForwardDiff.jl (https://github.com/JuliaDiff/ForwardDiff.jl).

ForwardDiff.jl is a package for performing automatic differentiation 
<https://en.wikipedia.org/wiki/Automatic_differentiation> on native Julia 
functions/callable objects. The techniques used by this package *are more 
performant and accurate than other standard algorithms for differentiation*, 
so if taking derivatives is something you're at all interested in, I 
suggest you give ForwardDiff.jl a try!

If you don't already have the package, you can install it with Julia's 
package manager by running the following:

julia> Pkg.update(); Pkg.add("ForwardDiff")

If you already have the old version of ForwardDiff.jl, you can update it to 
the new one by simply running Pkg.update().

Note that *the new version of ForwardDiff.jl only supports Julia v0.4. *Julia 
v0.3 users will have to stick to the old version of the package. Also note 
that *the new version introduces some breaking changes*, so you'll probably 
have to rewrite any old ForwardDiff.jl code you have (I promise it'll be 
worth it).

I've spent a good chunk of the summer overhauling it as part of my Julia 
Summer of Code project, so I hope other folks will find it useful. As 
always, opening issues and pull requests in ForwardDiff.jl's GitHub repo is 
very welcome.

I'd like to thank Julia Computing, NumFocus, and The Betty and Gordon Moore 
Foundation for putting JSoC together. And, of course, I thank Miles Lubin, 
Theo Papamarkou, and a host of other Julians for their invaluable guidance 
and mentorship throughout the project.

Best,
Jarrett




Re: [julia-users] Re: v0.4 - Pair does not behave like a Tuple

2015-09-03 Thread Jarrett Revels

>
> Though how would one correctly access the 'nth' elements type ?  
> It brings up a question regarding parameterized types, how should (or 
> should) one access/refer to the parameters of the type rather than an 
> instance. It seems like a bad (queasy) practice to dig into the parameters 
> of a type?


It's generally bad practice to access fields of objects that aren't 
"yours." In the case of the keytype function you defined, for example, it 
causes type instability. The more Julian (and type-stable) way to define 
such a function is:

keytype{K,V}(::Type{Dict{K,V}}) = K
keytype(dict::Dict) = keytype(typeof(dict))

If you *really* need a general, type stable way to access the nth parameter 
of a type, your best bet will probably be doing something like this:

   @generated function getparam{T,n}(::Type{T}, ::Type{Val{n}})
   P = T.parameters[n]
   return :($P)
   end

Though be warned: naive callers of this method might call it in a type 
unstable manner anyway by passing the index "n" from the value domain 
rather than the type domain.
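
For instance (hypothetical calls, continuing the sketch above):

    getparam(Dict{Symbol,Int64}, Val{1})   # -> Symbol; inferable, since the
                                           #    index lives in the type domain
    n = 2
    getparam(Dict{Symbol,Int64}, Val{n})   # -> Int64 at runtime, but since n
                                           #    is a value, inference can't see it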

Best,
Jarrett

On Thursday, September 3, 2015 at 10:04:22 AM UTC-4, Michael Francis wrote:
>
> I will likely do so, it's quite a common action to take. 
>
> Fortunately with typeof( ( 1,2 ) ) == Tuple{Int64,Int64} there is a 
> significantly greater consistency in 0.4 vs 0.3 
>
> Though how would one correctly access the 'nth' elements type ?  
>
> It brings up a question regarding parameterized types, how should (or 
> should) one access/refer to the parameters of the type rather than an 
> instance. It seems like a bad (queasy) practice to dig into the parameters 
> of a type?
>
> On Thursday, September 3, 2015 at 9:41:18 AM UTC-4, Tim Holy wrote:
>>
>> base/dict.jl defines keytype and valtype, but they are not exported. You 
>> could 
>> file a pull request that exports them (it would be a 2-line patch, though 
>> you 
>> might want to add a test to make sure they stay exported). 
>>
>> --Tim 
>>
>> On Thursday, September 03, 2015 06:37:02 AM Michael Francis wrote: 
>> > In the short term I have defined the following in the offending package 
>> for 
>> > v0.4 only 
>> > 
>> > function keytype( dict ) 
>> >  return eltype( dict ).parameters[1] 
>> > end 
>> > 
>> > I agree that a standard protocol of accessing the key and value types 
>> of a 
>> > pair / associative is the way to go. 
>> > 
>> > On Thursday, September 3, 2015 at 9:31:39 AM UTC-4, Matt Bauman wrote: 
>> > > Oh man that's tricky.  The trouble is that you're effectively saying 
>> > > `Pair{Symbol,Int}[1]`, which is the syntax for a typed array: 
>> > > Pair{Symbol,Int}[:x=>1, :y=>2].  One way around this is to define: 
>> > > 
>> > > keytype{A,B}(::Type{Pair{A,B}}) = A 
>> > > valuetype{A,B}(::Type{Pair{A,B}}) = B 
>> > > pairtypes{A,B}(::Type{Pair{A,B}}) = (A,B) 
>> > > 
>> > > If you need this to work on 0.3, too, you can easily make these 
>> functions 
>> > > work for the old-style Tuples, too. 
>> > > 
>> > > On Thursday, September 3, 2015 at 9:06:30 AM UTC-4, Michael Francis 
>> wrote: 
>> > >> Incidentally 
>> > >> 
>> > >> eltype( Pair{String,Float64} ) 
>> > >> 
>> > >> gives Any, that seems slightly strange as well . 
>> > >> 
>> > >> On Thursday, September 3, 2015 at 9:02:33 AM UTC-4, Michael Francis 
>> wrote: 
>> > >>> julia> eltype( Dict( :x => 1, :y => 2 ) )[1] 
>> > >>> ERROR: MethodError: `convert` has no method matching 
>> convert(::Type{Pair 
>> > >>> {Symbol,Int64}}, ::Int64) 
>> > >>> This may have arisen from a call to the constructor 
>> Pair{Symbol,Int64 
>> > >>> }(...), 
>> > >>> since type constructors fall back to convert methods. 
>> > >>> 
>> > >>> Closest candidates are: 
>> > >>>   Pair{A,B}(::Any, ::Any) 
>> > >>>   call{T}(::Type{T}, ::Any) 
>> > >>>   convert{T}(::Type{T}, ::T) 
>> > >>>   
>> > >>>  in getindex at array.jl:167 
>> > >>> 
>> > >>> Is this intentional ? This breaks a package I am dependent on - I 
>> > >>> believe the assumption was that Pair would respect the tuple API, 
>> this 
>> > >>> appears to not be the case ? 
>> > >>> 
>> > >>> collect( eltype( Dict( :x => 1, :y => 2 ) ) ) 
>> > >>> ERROR: MethodError: `start` has no method matching 
>> start(::Type{Pair{ 
>> > >>> Symbol,Int64}}) 
>> > >>> 
>> > >>>  in collect at array.jl:255 
>> > >>>  in collect at array.jl:262 
>>
>>

[julia-users] Re: ANN: JuMP 0.10 released

2015-09-01 Thread Jarrett Revels
Woohoo! That's a lot of new stuff.

-- Jarrett

On Tuesday, September 1, 2015 at 12:41:26 AM UTC-4, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.10.
>
> This is a major release with the greatest amount of new functionality 
> since the addition of nonlinear modeling last year. This will likely be the 
> last major release of JuMP to support Julia 0.3. Thanks to the heroic work 
> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for 
> *semidefinite 
> programming*.
>
> You can now write, for example:
>
> @defVar(m, x[1:5])
> @addConstraint(m, A*x .== 0)
>
> where A is a Julia matrix (dense or sparse). Note that we require dot 
> comparison operators .== (and similarly .<= and .>=) for vectorized 
> constraints. The vectorized syntax extends to quadratic but not general 
> nonlinear expressions.
>
> An important new concept to keep in mind is that this vectorized syntax 
> only applies to sets of variables which are one-based arrays. If you 
> declare variables indexed by more complicated sets, e.g.,
>
> @defVar(m, y[3:5])
> s = [:cat, :dog, :pizza]
> @defVar(m, z[s])
>
> then dot(y,z) and rand(3,3)*z are undefined. A result of this new concept 
> of one-based arrays is that x above now has the type Vector{JuMP.Variable}. 
> In this case, getValue() now returns a Vector{Float64} instead of an 
> opaque JuMP object. We hope users find this new distinction between 
> one-indexed array variables and all other symbolically indexed variables 
> useful and intuitive (if not, let us know).
>
> For semidefinite modeling, you can declare variables as SDP matrices and 
> add LMI (linear matrix inequality) constraints as illustrated in the 
> examples for minimal ellipse and max cut, among others.
>
> We also have a *new syntax for euclidean norms:*
>
> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
> # or
> @addConstraint(m, norm(c.*x+b) <= 10)
>
> You may be wondering how JuMP compares with Convex.jl given these new 
> additions. Not much has changed philosophically; JuMP directly translates 
> SDP constraints and euclidean norms into the sparse matrix formats as 
> required by conic solvers. Unlike Convex.jl, *JuMP accepts only 
> standard-form SDP and second-order conic constraints and will not perform 
> any automatic transformations* such as modeling nuclear norms, minimum 
> eigenvalue, geometric mean, rational norms, etc. We would recommend using 
> Convex.jl for easy modeling of such functions. Our focus, for now, is on 
> the large-scale performance and stability of the huge amount of new syntax 
> introduced in this release.
>
> Also notable in this release:
> - JuMP models now store a dictionary of attached variables, so that you 
> can look up a variable from a model by name by using the new getVar() 
> method.
> - On Julia 0.4 only, you can now use a filter in variable declarations, e.g.,
> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
> will only create variables for the indices which satisfy the filter 
> condition. (These are not one-based arrays as introduced above.)
> - Dual multipliers are available for nonlinear problems from the solvers 
> which provide them
> - There is improved documentation for querying derivatives from a 
> nonlinear JuMP model 
> - *We now try to print warnings for two common performance traps*: 
> calling getValue() in a tight loop and using operator overloading to 
> construct large JuMP expressions. Please let us know if these are useful or 
> annoying or both so that we can tune the warning thresholds.
> - Thanks to Tony Kelman and Jack Dunn, you can now call a large number of 
> external solvers including Bonmin and Couenne through either the .osil or 
> .nl exchange formats.
> - Module precompilation speeds up using JuMP considerably, for those on 
> Julia 0.4
>
> The delay since the last release of JuMP is mostly due to us trying to 
> test and refine the new syntax, but inevitably some bugs have slipped 
> through, so please let us know of any incorrect or confusing behavior.
>
> Also newsworthy is our new paper 
> describing the methods used in JuMP with benchmark comparisons to existing 
> open-source and commercial optimization modeling software.
>
> Miles, Iain, and Joey
>
>
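
For readers who want to try the vectorized syntax announced above, a minimal 
self-contained sketch (assumes an LP solver such as Clp is installed so that 
solve can find one):

    using JuMP

    m = Model()
    A = rand(3, 5)
    @defVar(m, x[1:5] >= 0)          # a one-based array of variables
    @addConstraint(m, A*x .== 0)     # vectorized equality constraint
    @setObjective(m, Min, sum(x))
    status = solve(m)
    getValue(x)                      # Vector{Float64}, since x is one-based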

[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Jarrett Revels
Actually, just saw this: https://github.com/JuliaLang/julia/issues/9818. 
Ignore the messed up 
@code_typed stuff in my previous reply to this thread.

I believe the type-inference concerns are still there, however, even if 
@code_typed doesn't correctly report them, so the fixes I listed should 
still be useful for patching over inferencing problems with keyword 
arguments.

Best,
Jarrett

On Tuesday, September 1, 2015 at 12:49:02 PM UTC-4, Jarrett Revels wrote:
>
> Related: https://github.com/JuliaLang/julia/issues/9551
>
> Unfortunately, as you've seen, type-variadic keyword arguments can really 
> mess up type-inferencing. It appears that keyword argument types are pulled 
> from the default arguments rather than those actually passed in at runtime:
>
> julia> f(x; a=1, b=2) = a*x^b
> f (generic function with 1 method)
>
> julia> f(1)
> 1
>
> julia> f(1, a=(3+im), b=5.15)
> 3.0 + 1.0im
>
> julia> @code_typed f(1, a=(3+im), b=5.15)
> 1-element Array{Any,1}:
>  :($(Expr(:lambda, Any[:x], Any[Any[Any[:x,Int64,0]],Any[],Any[Int64],Any[]], :(begin $(Expr(:line, 1, :none, symbol("")))
> GenSym(0) = (Base.power_by_squaring)(x::Int64,2)::Int64
> return (Base.box)(Int64,(Base.mul_int)(1,GenSym(0)))::Int64
> end::Int64
>
> Obviously, that specific call to f does NOT return an Int64.
>
> I know of only two reasonable ways to handle it at the moment:
>
> 1. If you're the method author: Restrict every keyword argument to a 
> declared, concrete type, which ensures that the argument isn't 
> type-variadic. Yichao basically gave an example of this.
> 2. If you're the method caller: Manually assert the return type. You can 
> do this pretty easily in most cases using a wrapper function. 
> Using `f` from above as an example:
>
> julia> g{X,A,B}(x::X, a::A, b::B) = f(x, a=a, b=b)::promote_type(X, A, B)
> g (generic function with 2 methods)
>
> julia> @code_typed g(1,2,3)
> 1-element Array{Any,1}:
>  :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Int64,0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
> return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Int64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Int64)::Int64
> end::Int64
>
> julia> @code_typed g(1,2,3.0)
> 1-element Array{Any,1}:
>  :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Float64,0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
> return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Float64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Float64)::Float64
> end::Float64
>
> julia> @code_typed g(1,2,3.0+im)
> 1-element Array{Any,1}:
>  :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Complex{Float64},0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
> return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Complex{Float64},Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Complex{Float64})::Complex{Float64}
> end::Complex{Float64}
>
> Thus, downstream functions can call *f* through *g*, preventing 
> type-instability from "bubbling up" to the calling methods (as it would if 
> they called *f* directly).
>
> Best,
> Jarrett
>
> On Tuesday, September 1, 2015 at 8:39:11 AM UTC-4, Michael Francis wrote:
>>
>> 2) The underlying functions are only stable if the mean passed to them is 
>> of the correct type, e.g. a number. Essentially this is a type inference 
>> issue, if the compiler was able to optimize  the branches then it would be 
>> likely be ok, it looks from the LLVM code that this is not the case today. 
>>
>> FWIW using a type stable version (e.g. directly calling covm) looks to be 
>> about 18% faster for small (100 element) AbstractArray pairs. 
>>
>> On Monday, August 31, 2015 at 9:06:58 PM UTC-4, Sisyphuss wrote:
>>>
>>> IMO:
>>> 1) This is called keyword argument (not named optional argument).

[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Jarrett Revels
Related: https://github.com/JuliaLang/julia/issues/9551

Unfortunately, as you've seen, type-variadic keyword arguments can really 
mess up type-inferencing. It appears that keyword argument types are pulled 
from the default arguments rather than those actually passed in at runtime:

julia> f(x; a=1, b=2) = a*x^b
f (generic function with 1 method)

julia> f(1)
1

julia> f(1, a=(3+im), b=5.15)
3.0 + 1.0im

julia> @code_typed f(1, a=(3+im), b=5.15)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x], Any[Any[Any[:x,Int64,0]],Any[],Any[Int64],Any[]], :(begin $(Expr(:line, 1, :none, symbol("")))
GenSym(0) = (Base.power_by_squaring)(x::Int64,2)::Int64
return (Base.box)(Int64,(Base.mul_int)(1,GenSym(0)))::Int64
end::Int64

Obviously, that specific call to f does NOT return an Int64.

I know of only two reasonable ways to handle it at the moment:

1. If you're the method author: Restrict every keyword argument to a 
declared, concrete type, which ensures that the argument isn't 
type-variadic. Yichao basically gave an example of this.
2. If you're the method caller: Manually assert the return type. You can do 
this pretty easily in most cases using a wrapper function. 
Using `f` from above as an example:

julia> g{X,A,B}(x::X, a::A, b::B) = f(x, a=a, b=b)::promote_type(X, A, B)
g (generic function with 2 methods)

julia> @code_typed g(1,2,3)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Int64,0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Int64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Int64)::Int64
end::Int64

julia> @code_typed g(1,2,3.0)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Float64,0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Float64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Float64)::Float64
end::Float64

julia> @code_typed g(1,2,3.0+im)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Complex{Float64},0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Complex{Float64},Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Complex{Float64})::Complex{Float64}
end::Complex{Float64}

Thus, downstream functions can call *f* through *g*, preventing 
type-instability from "bubbling up" to the calling methods (as it would if 
they called *f* directly).
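
For completeness, a minimal sketch of option 1 above, with every keyword 
argument given a concrete type (the function name is just for illustration):

    # concretely-typed keyword arguments: the return type no longer depends
    # on which keyword values a caller happens to pass in
    h(x; a::Int=1, b::Int=2) = a*x^b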

Best,
Jarrett

On Tuesday, September 1, 2015 at 8:39:11 AM UTC-4, Michael Francis wrote:
>
> 2) The underlying functions are only stable if the mean passed to them is 
> of the correct type, e.g. a number. Essentially this is a type inference 
> issue, if the compiler was able to optimize  the branches then it would be 
> likely be ok, it looks from the LLVM code that this is not the case today. 
>
> FWIW using a type stable version (e.g. directly calling covm) looks to be 
> about 18% faster for small (100 element) AbstractArray pairs. 
>
> On Monday, August 31, 2015 at 9:06:58 PM UTC-4, Sisyphuss wrote:
>>
>> IMO:
>> 1) This is called keyword argument (not named optional argument).
>> 2) The returned value depends only on `corzm`, and `corm`. If these two 
>> functions are type stable, then `cor` is type stable.
>> 3) I'm not sure whether this is the "correct" way to write this function.
>>
>> On Monday, August 31, 2015 at 11:48:37 PM UTC+2, Michael Francis wrote:
>>>
>>> The following is taken from statistics.jl line 428 
>>>
>>> function cor(x::AbstractVector, y::AbstractVector; mean=nothing)
>>> mean == 0 ? corzm(x, y) :
>>> mean == nothing ? corm(x, Base.mean(x), y, Base.mean(y)) :
>>> isa(mean, (Number,Number)) ? corm(x, mean[1], y, mean[2]) :
>>> error("Invalid value of mean.")
>>> end
>>>
>>> due to the 'mean' initially having a type of 'Nothing' I am unable to 
>>> inference the return type of the function - the following will return Any 
>>> for the return type.
>>>
>>> rt = {}
>>> for x in Base._methods(f,types,-1)
>>> linfo = x[3].func.code
>>> (tree, ty) = Base.typeinf(linfo, x[1], x[2])
>>> push!(rt, ty)
>>> end
>>>
>>> Each of the underlying fu

Re: [julia-users] Re: ANN: JuliaDiff -- differentiation tools in Julia

2015-08-26 Thread Jarrett Revels
Hi, 

This could be a chance to battle-test ForwardDiff.jl's new API. It takes 
exact derivatives and we're trying to make it easy to use. If your discrete 
grid is granular enough, ForwardDiff.jl should work great. Just keep in 
mind that ForwardDiff is so accurate (i.e., to machine-epsilon precision of 
the functions it evaluates), that the derivative values might not be what 
you expect - in the words of Alexey Radul: "[Automatic differentiation] 
will give you derivatives of the procedure you actually programmed, which 
may not be good approximations of the derivatives of the ideal function 
that your procedure was approximating."

Anyway, here's an example. Let's say you had a Julia function "f" to map 
your x values to your y values:

julia> f(x) = # compute y value using your grid data.
f (generic function with 1 method)

# Assuming f: Number → Number
julia> derivf = ForwardDiff.derivative(f)
derivf (generic function with 1 method)

julia> derivf(x) # will return df(x)/dx

If f: Vector → Number, than ForwardDiff also offers "gradient" and 
"hessian" methods that are used in the same way as above (as well as a 
"jacobian" method if f: Vector → Vector).

You might also try producing a "smoother" f(x) using Interpolations.jl, 
then running that function through ForwardDiff.jl
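
A sketch of that last suggestion, assuming Interpolations.jl's BSpline/OnGrid 
API of the time and that its evaluation code is generic enough to accept 
ForwardDiff's number types (if it isn't, Interpolations' own gradient 
function is an alternative):

    using Interpolations, ForwardDiff

    ydata = Float64[sin(t) for t in linspace(0, 2π, 100)]  # stand-in grid data
    itp = interpolate(ydata, BSpline(Cubic(Line())), OnGrid())

    f(i) = itp[i]                        # smooth f over grid indices 1..100
    derivf = ForwardDiff.derivative(f)
    derivf(42.5)                         # df/di at an off-grid point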

Best,
Jarrett

On Wednesday, August 26, 2015 at 6:56:01 AM UTC-4, Spencer Lyon wrote:
>
> It would be awesome to get this under AD. 
>
> I'm not sure I follow with how to get DF(y) at each output point. Can you 
> provide a quick example (made up data is totally fine)?
>
> Thanks
>
> // Spencer
>
> On August 25, 2015 at 4:15:28 PM EDT, Christoph Ortner <christop...@gmail.com> wrote:
>
> In this case AD will actually work. Since 
>
> y'' = DF(y) y'
>
> you can use AD to compute DF(y) at each output point of the curve and get 
>  y'' as above.
>
> Christoph
>
>  
>
>

[julia-users] ANN: Major overhaul of ForwardDiff.jl

2015-08-14 Thread Jarrett Revels
Hi!

ForwardDiff.jl, a Julia 
package for performing automatic differentiation, has just been updated and 
is now much less buggy, much more performant, and much more comprehensively 
tested. 

An additional goal of the update was to provide a more friendly API, so 
hopefully the package is now easier to use for newcomers (documentation can 
be found in the master branch's README).

*To use the updated version of the package, check out its most recent 
commit on the master branch*. The update is not yet tagged for release, as 
we want to give people a chance to work with the new API some before 
tagging a new version. Feel free to provide feedback in the form of an 
issue or pull request to the repository!

Note that this update only supports Julia v0.4.

Best,
Jarrett




[julia-users] Re: GSoC 2015 with JuliaQuantum

2015-05-19 Thread Jarrett Revels
Amit has already contributed a fair amount to QuBase.jl. His and 
Alexander's work on dynamics solvers will make for an exciting summer for 
JuliaQuantum!

Looking forward to it!

-- Jarrett

On Monday, May 18, 2015 at 10:50:02 PM UTC-4, Viral Shah wrote:
>
> This is amazing! Do keep us posted on how things go.
>
> -viral
>
> On Tuesday, May 19, 2015 at 4:42:39 AM UTC+5:30, Amit Jamadagni wrote:
>>
>> Hello everyone,
>>I am most happy to inform that my project titled JuliaQuantum: 
>> Framework for Solvers has been selected for GSoC 2015 under the 
>> NumFOCUS umbrella. I will be working on creating a framework for solvers 
>> used in Quantum Mechanics under the mentorship of Alexander Croy and 
>> support from the Julia Quantum team. The project will be using 
>> implementations from QuBase.jl, ODE.jl, and Expokit.jl, and will also aim to 
>> implement a few enhancements in those packages in the course of development. 
>> The updates of the project can be followed under the news section on the 
>> JuliaQuantum site. The end result of the project will be to populate the 
>> package QuDynamics.jl with features outlined in the project proposal.
>>  
>> It would be great to hear from the community on the above mentioned ideas.
>>
>> Thanking you,
>> Amit. 
>>
>

Re: [julia-users] ANN: QuDirac.jl

2015-04-26 Thread Jarrett Revels
Shor's algorithm can definitely be simulated with QuDirac, and is certainly 
a good candidate for the `examples` folder :) I'm pretty crunched for time 
today and tomorrow but I'll see if I can't bang out an implementation this 
week and post it back here. 

On Saturday, April 25, 2015 at 4:21:11 PM UTC-4, Christian Peel wrote:
>
> Nice!
>
> I'm curious how tough it would be to simulate Shor's algorithm, even if 
> it's just to factor 15; Is there some reason that this is too tough to 
> include as an example?
>
> chris
>
> On Thu, Apr 23, 2015 at 6:02 PM, Jarrett Revels wrote:
>
>> I'm happy to say that I've finally released QuDirac.jl 
>> <https://github.com/JuliaQuantum/QuDirac.jl>!
>>
>> This package is for performing common quantum mechanical operations using 
>> Dirac notation.
>>
>> Feature list:
>>
>> - Implementations of state types (`Ket`,`Bra`), and a variety of operator 
>> types (`OpSum`,`OuterProduct`)
>> - Treat states and operators as map-like data structures, enabling 
>> label-based analysis for spectroscopy purposes
>> - Implementation of common operations like partial trace (`ptrace`) and 
>> partial transpose (`ptranspose`)
>> - Support for abstract/undefined inner products
>> - User-definable custom inner product rules
>> - Subspace selection/transformation via functions on state labels and 
>> coefficients:
>> - `xsubspace` allows easy selection of excitation subspaces of states 
>> and operators
>> - `permute` and `switch` allows generic permutation of factor labels 
>> for states
>> - `filter`/`filter!` are supported on both the labels and 
>> coefficients of operators/states
>> - Mapping functions (`map`/`maplabels`/`mapcoeffs`) for applying 
>> arbitrary functions to labels and coefficients
>> - Functional generation of operators using `@def_op` and `@rep_op`
>> - `d" ... "` literals for natural Dirac notation input syntax
>>
>> -- Jarrett
>>
>
>
>
> -- 
> chris...@ieee.org 
>  


[julia-users] ANN: QuDirac.jl

2015-04-23 Thread Jarrett Revels
I'm happy to say that I've finally released QuDirac.jl!

This package is for performing common quantum mechanical operations using 
Dirac notation.

Feature list:

- Implementations of state types (`Ket`,`Bra`), and a variety of operator 
types (`OpSum`,`OuterProduct`)
- Treat states and operators as map-like data structures, enabling 
label-based analysis for spectroscopy purposes
- Implementation of common operations like partial trace (`ptrace`) and 
partial transpose (`ptranspose`)
- Support for abstract/undefined inner products
- User-definable custom inner product rules
- Subspace selection/transformation via functions on state labels and 
coefficients:
- `xsubspace` allows easy selection of excitation subspaces of states 
and operators
- `permute` and `switch` allows generic permutation of factor labels 
for states
- `filter`/`filter!` are supported on both the labels and coefficients 
of operators/states
- Mapping functions (`map`/`maplabels`/`mapcoeffs`) for applying 
arbitrary functions to labels and coefficients
- Functional generation of operators using `@def_op` and `@rep_op`
- `d" ... "` literals for natural Dirac notation input syntax
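
To give a feel for how these pieces fit together, here is a rough, untested 
sketch using only names from the list above; the exact `d" ... "` literal 
syntax and function signatures here are assumptions, so treat the package 
README as the authority:

using QuDirac

bell = d" | 0,0 > + | 1,1 > "   # build an (unnormalized) Ket via a d"..." literal
rho  = bell * bell'             # outer product of Ket and Bra gives an operator
rhoA = ptrace(rho, 2)           # partial trace over the second factor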

-- Jarrett


Re: [julia-users] Use of convert() within setindex!() in array.jl

2014-04-26 Thread Jarrett Revels
Ah, makes sense. Thanks! 

On Saturday, April 26, 2014 12:33:48 PM UTC-4, Tim Holy wrote:
>
> When x is of type T, convert(T,x) is a no-op: 
>
> julia> function myfunc(x) 
>            a = convert(Float64, x) 
>            return a 
>        end 
> myfunc (generic function with 1 method) 
>
> julia> code_native(myfunc, (Float32,)) 
> .text 
> Filename: none 
> Source line: 2 
> push    RBP 
> mov RBP, RSP 
> Source line: 2 
> movabs  RAX, 140048916337192 
> movss   DWORD PTR [RAX], XMM0 
> movss   XMM0, DWORD PTR [RAX] 
> cvtss2sd    XMM0, XMM0 
> Source line: 3 
> pop RBP 
> ret 
>
> julia> code_native(myfunc, (Float64,)) 
> .text 
> Filename: none 
> Source line: 3 
> push    RBP 
> mov RBP, RSP 
> Source line: 3 
>     pop RBP 
> ret 
>
>
> --Tim 
>
> On Saturday, April 26, 2014 08:45:21 AM Jarrett Revels wrote: 
> > Hello all, 
> > 
> > I was exploring some of the base code today when I saw the implementation 
> > of setindex!(): 
> > 
> > 
> > setindex!{T}(A::Array{T}, x, i0::Real) = arrayset(A, convert(T,x), 
> > to_index(i0)) 
> > 
> > I was surprised that I couldn't find setindex!() methods for elements that 
> > used the type system to forgo the use of convert() seen above. For 
> > example, wouldn't it be better to also define methods of the form 
> > 
> > setindex!{T}(A::Array{T}, x::T, i0::Real) = arrayset(A, x, to_index(i0)) 
> > 
> > so that conversion only occurs when necessary? 
> > 
> > I would be willing to make the changes myself, but I assume that there is a 
> > reason for this design choice that I'm just missing. 
>
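
The no-op is also visible at the source level: Base ships an identity fallback 
for convert, so the call dispatches away entirely whenever the types already 
match. Roughly, in the parametric syntax of the time:

# Paraphrase of the identity method in Base: converting a value to its own
# type just returns it, so convert(T, x) is free after inlining when x::T.
convert{T}(::Type{T}, x::T) = x

This is why the unconditional convert(T, x) inside setindex! costs nothing when 
x is already of element type T, and no specialized x::T method is needed.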


[julia-users] Use of convert() within setindex!() in array.jl

2014-04-26 Thread Jarrett Revels
Hello all,

I was exploring some of the base code today when I saw the implementation 
of setindex!():


setindex!{T}(A::Array{T}, x, i0::Real) = arrayset(A, convert(T,x), to_index(i0))

I was surprised that I couldn't find setindex!() methods for elements that used 
the type system to forgo the use of convert() seen above. 
For example, wouldn't it be better to also define methods of the form 

setindex!{T}(A::Array{T}, x::T, i0::Real) = arrayset(A, x, to_index(i0))

so that conversion only occurs when necessary?

I would be willing to make the changes myself, but I assume that there is a 
reason for this design choice that I'm just missing. 



Re: [julia-users] Type Parameter Scope Inconsistency?

2014-03-21 Thread Jarrett Revels
I also made a typo in my question; the parameter I'm referring to is the 
parameter of Vector in x, i.e. T in x::Dict{Int, Vector{T}}.

On Friday, March 21, 2014 5:44:19 PM UTC-4, Jarrett Revels wrote:
>
> Here's a gist clarifying what I'm trying to do in my second question: 
> https://gist.github.com/jrevels/9697093
>

Re: [julia-users] Type Parameter Scope Inconsistency?

2014-03-21 Thread Jarrett Revels
Here's a gist clarifying what I'm trying to do in my second 
question: https://gist.github.com/jrevels/9697093


Re: [julia-users] Type Parameter Scope Inconsistency?

2014-03-20 Thread Jarrett Revels
Ah, that makes sense, thanks. 

The reason why I originally wrote that off without trying it (not a good 
idea, I suppose!) is that I thought it would give me the same kind 
of behavior exhibited when you define a function like:

f{N<:Number}(a::N, b::N)

which asserts that a and b are the same type N (e.g. f(1.0, 1) would 
throw a no method error) as opposed to the more general assertion 
that typeof(a)<:Number and typeof(b)<:Number.

So B<:Bar works in this instance because it is the parameter of another 
type (Array, in this case), unlike N in the above example?
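
(A quick REPL check of the diagonal-dispatch behavior described above; the 
definition is illustrative:)

f{N<:Number}(a::N, b::N) = (a, b)

f(1, 2)      # ok: both arguments are Int, so N binds to Int
f(1.0, 2)    # ERROR: no method -- no single N matches both Float64 and Int64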

I also have a follow-up question in this same vein. Let's say I define Foo 
to have an extra field x:

type Foo{B<:Bar}
    y::Vector{B}
    x::Dict{Int, Vector}
end

I want to assert that the type parameter of the Vector in x equals the type 
parameter of B whenever B is specialized to a concrete type, 
or is Any if eltype(y) is the unspecialized Bar{T}. 

I could envision writing a constructor that typejoins all types of the 
elements of y, and uses that to determine 
the appropriate type parameter for Vector in x, but is there a way to have 
this happen naturally solely with assertions? 

If I did write a constructor instead of relying entirely on the default type 
constructors, I suppose I would have to do the following to account for the 
fact that the assertion is not in the type definition itself:

type Foo{B<:Bar, V<:Vector}
    y::Vector{B}
    x::Dict{Int, V}
end
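
(For what it's worth, the typejoin idea above could be sketched as an outer 
constructor along these lines -- untested, with the parameter name P purely 
illustrative, and assuming y is nonempty:)

# Join the element parameters of y to pick the Vector parameter for x:
# typejoin(Int64, Int64) == Int64, typejoin(Int64, ASCIIString) == Any, etc.
function Foo{B<:Bar}(y::Vector{B})
    P = reduce(typejoin, map(b -> typeof(b.x), y))
    Foo{B, Vector{P}}(y, Dict{Int, Vector{P}}())
end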

Thanks, 
Jarrett

On Thursday, March 20, 2014 2:12:28 AM UTC-4, Jameson wrote:
>
> There is no T which satisfies the type parameter for Foo(B). Therefore 
> this type cannot be constructed.
>
> If you wanted to create a type which is specialized on any Bar object, 
> then you must do so at the type parameter level:
> type Foo{B<:Bar}
>     y::Array{B}
> end
>
> Note that Array{T} is still not fully specialized. You likely want 
> Vector{T} or Array{T,1} so that the access to the Foo.y element can be 
> fully optimized.
>
>
> On Thu, Mar 20, 2014 at 1:03 AM, Jarrett Revels wrote:
>
>> Hello Julianistas, 
>>
>> I'm running into what seems to be a strange inconsistency in the way the 
>> compiler is treating type parameters.
>>
>> Consider the following code:
>>
>> type Bar{T}
>>     x::T
>> end
>>
>> type Foo{T}
>>     y::Array{Bar{T}}
>> end
>>
>> A = [Bar(1), Bar(2), Bar(3)]
>> B = [Bar(1), Bar("A"), Bar({1,2,3})]
>>
>> There are two ways one might interpret how T is scoped in Foo{T}. I coded 
>> this test to investigate which way Julia interprets it, but the problem is 
>> that it seems to be interpreting it in two different ways at two separate 
>> points. 
>>
>> The first way to interpret it: T is the type parameter of EVERY Bar in an 
>> Array, and is explicitly consistent across Bars, i.e. every Bar in the 
>> array is of type Bar{T2} where T2 equals the same T as in Foo{T}.
>>
>> The second way to interpret it: T refers to the type parameter of each 
>> individual Bar in the Array, and is not explicitly consistent across Bars. 
>> Thus, if the individual Bars have different type parameters, the overall 
>> type of the array would be Array{Bar{T},1}.
>>
>> To test which interpretation Julia uses, I use the arrays A and B above. 
>> Check out the output when I try to construct different instances of Foo 
>> with each:
>>
>> julia> Foo(A)
>> Foo{Int64}([Bar{Int64}(1),Bar{Int64}(2),Bar{Int64}(3)])
>>
>> julia> Foo(B)
>> ERROR: no method Foo{T}(Array{Bar{T},1})
>>
>> What's happening here? It seems like the compiler is constraining the 
>> constructor to the first interpretation, but treats the actual type of the 
>> object B as it would using the second interpretation. Explicitly:
>>
>> julia> typeof(A)
>> Array{Bar{Int64},1}
>>
>> julia> typeof(B)
>> Array{Bar{T},1}
>>
>> How can this be the case? I have also tried creating an explicit outer 
>> constructor, but got the same error.
>>
>> Thanks for any insights, 
>> Jarrett
>>
>
>

[julia-users] Type Parameter Scope Inconsistency?

2014-03-19 Thread Jarrett Revels
Hello Julianistas, 

I'm running into what seems to be a strange inconsistency in the way the 
compiler is treating type parameters.

Consider the following code:

type Bar{T}
    x::T
end

type Foo{T}
    y::Array{Bar{T}}
end

A = [Bar(1), Bar(2), Bar(3)]
B = [Bar(1), Bar("A"), Bar({1,2,3})]

There are two ways one might interpret how T is scoped in Foo{T}. I coded 
this test to investigate which way Julia interprets it, but the problem is 
that it seems to be interpreting it in two different ways at two separate 
points. 

The first way to interpret it: T is the type parameter of EVERY Bar in an 
Array, and is explicitly consistent across Bars, i.e. every Bar in the 
array is of type Bar{T2} where T2 equals the same T as in Foo{T}.

The second way to interpret it: T refers to the type parameter of each 
individual Bar in the Array, and is not explicitly consistent across Bars. 
Thus, if the individual Bars have different type parameters, the overall 
type of the array would be Array{Bar{T},1}.

To test which interpretation Julia uses, I use the arrays A and B above. 
Check out the output when I try to construct different instances of Foo 
with each:

julia> Foo(A)
Foo{Int64}([Bar{Int64}(1),Bar{Int64}(2),Bar{Int64}(3)])

julia> Foo(B)
ERROR: no method Foo{T}(Array{Bar{T},1})

What's happening here? It seems like the compiler is constraining the 
constructor to the first interpretation, but treats the actual type of the 
object B as it would using the second interpretation. Explicitly:

julia> typeof(A)
Array{Bar{Int64},1}

julia> typeof(B)
Array{Bar{T},1}

How can this be the case? I have also tried creating an explicit outer 
constructor, but got the same error.

Thanks for any insights, 
Jarrett
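
(For later readers: the asymmetry observed above comes down to parametric 
invariance in Julia's type system. A container type is never covariant in its 
element parameter, even when the element types themselves are related:)

Bar{Int} <: Bar                      # true
Array{Bar{Int},1} <: Array{Bar,1}    # false: Array is invariant in its eltype

So a signature written against Array{Bar{T},1} for a single T cannot accept an 
array whose elements carry different parameters, which is what parameterizing 
Foo itself (Foo{B<:Bar}, as suggested in the replies above) works around.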