I didn't look at ll2. But that one seems OK.
I didn't read the whole thread; are you timing just the execution of the
objective function, or of the whole optimization? You can't easily interpret
the latter.
--Tim
On Sunday, June 22, 2014 09:13:49 AM Thibaut Lamadon wrote:
Hi Tim
Is this a concern even though I declare u1::Float64 = 0; at the beginning
of the function, in ll2?
t.
On Sunday, 22 June 2014 15:57:53 UTC+1, Tim Holy wrote:
If x1, ..., x6 or coeff are Float64 arrays, then the initialization
u1 = 0; u2 = 0; u3 = 0; u4 = 0; u5 = 0; u6 = 0
is problematic as soon as you get to
for k=1:nVar
    u1 += x1[i + ni*( k-1 + nk* (t-1))]*coeff[k]
    u2 += x2[i + ni*( k-1 + nk* (t-1))]*coeff[k]
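A minimal sketch of the type-stability issue being described here (the
function names and the simplified loop are placeholders, not the thread's
actual ll2 code):

    # Type-unstable: u starts life as an Int (from u = 0) and becomes a
    # Float64 on the first +=, so the compiler cannot give it one concrete type.
    function accum_unstable(x::Vector{Float64})
        u = 0
        for v in x
            u += v
        end
        return u
    end

    # Type-stable: start from a Float64 zero, or use zero(eltype(x)).
    # Declaring u::Float64 = 0 inside the function (as asked above) also works,
    # because every assignment to u is then converted to Float64.
    function accum_stable(x::Vector{Float64})
        u = zero(eltype(x))
        for v in x
            u += v
        end
        return u
    end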
Hi guys,
I wanted to look into this as well. The main issue, I think, is the speed
of the objective function. Running @time on the objective function
suggested a large amount of byte allocation. Checking the types revealed
that getting x and y from data sets their types to Any.
So I conv
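A hedged illustration of that kind of check (the data layout here is
invented; the point is only the Any-to-concrete conversion):

    # If the raw data is an Array{Any,2} (e.g. from a CSV with mixed columns),
    # slices of it inherit the Any element type and every operation allocates.
    data = Any[1.0 2.0; 3.0 4.0]
    x = data[:, 1]
    println(typeof(x))                        # element type is Any

    # Convert once, outside the objective function, to a concrete type:
    xf = convert(Vector{Float64}, data[:, 1])
    println(typeof(xf))                       # now Float64 all the way through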
Hi John, hi Miles,
Thanks to both of you. I did not have time to look into this over the
weekend; I will do so in the next couple of days. I have already uploaded
the Matlab files for comparison:
https://gist.github.com/stichnoth/7f251ded83dcaa384273
Holger
On Thursday, 22 May 2014 23:03:58
Yeah, this case is tricky enough that we really need to get down to the lowest
details:
(1) Do Julia and Matlab perform similar numbers of function evaluations?
(2) If they don't perform similar numbers of function evaluations, is one of
them producing a better solution? Is the one that's produ
I can get another 50% speedup by:
- Running the optimization twice and timing the second run only; this is
the more appropriate way to benchmark Julia because it excludes
compilation time (see the sketch after this list)
- Setting autodiff=true
- Breaking up the long chains of sums, which appear to be slow
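A sketch of that warm-up-and-time pattern (beta0 stands in for whatever
starting vector is being used):

    using Optim

    optimize(clogit_ll, beta0)          # first run: includes JIT compilation
    @time optimize(clogit_ll, beta0)    # second run: the number worth comparing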
How many function evaluations is Matlab performing and how many is Julia
performing?
— John
On May 22, 2014, at 6:53 AM, Holger Stichnoth wrote:
Thanks, it's faster now (by roughly a factor of 3 on my computer), but
still considerably slower than fminunc:
Averages over 20 runs:
Julia/Optim.optimize: 10.5s
Matlab/fminunc: 2.6s
Here are my Matlab settings:
options = optimset('Display', 'iter', ...
'MaxIter', 2500, 'MaxFunEvals', 5
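For reference, the rough counterpart on the Julia side, written with current
Optim.jl syntax (the 2014 API discussed in this thread passed these as
keyword arguments to optimize directly; the tolerance value below is a
placeholder, not Matlab's exact default):

    using Optim

    opts = Optim.Options(iterations = 2500, g_tol = 1e-6, show_trace = true)
    res  = optimize(clogit_ll, beta0, LBFGS(), opts)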
I was able to get a nearly 5x speedup by avoiding the matrix allocation and
making the accumulators type stable:
https://gist.github.com/mlubin/055690ddf2466e98bba6
How does this compare with Matlab now?
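This is not the actual gist, just a hedged sketch of the two patterns
mentioned (allocating vs. devectorized), with made-up names:

    # Allocating: X * coeff builds a temporary vector on every objective call.
    obj_alloc(X, coeff) = sum(X * coeff)

    # Devectorized: accumulate into a type-stable scalar, no temporaries.
    function obj_devec(X::Matrix{Float64}, coeff::Vector{Float64})
        u = 0.0
        for k in 1:length(coeff), i in 1:size(X, 1)
            u += X[i, k] * coeff[k]
        end
        return u
    end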
On Thursday, May 22, 2014 6:38:44 AM UTC-4, Holger Stichnoth wrote:
@ John: You are right, when I specify the function as
clogit_ll(beta::Vector) instead of clogit_ll(beta::Vector{Float64}),
autodiff = true works fine. Thanks for your help!
@ Tim: I have set the rather strict default convergence criteria of
Optim.optimize to Matlab's default values for fminunc,
Just to extend on what John said, I also think that if you can restructure
the code to devectorize it and avoid using global variables, you'll see
*much* better performance.
The way to avoid globals is by using closures, for example:
function foo(x, data)
...
end
...
data_raw = readcsv(file
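A hedged sketch of how that snippet might continue (the file name, the
column layout, and the trivial objective body are all placeholders; readcsv
is the Julia 0.x function used above, readdlm/CSV.jl in current Julia):

    using Optim

    function foo(beta, data)
        x, y = data
        return sum(abs2, x * beta .- y)     # stand-in for the real objective
    end

    data_raw = readcsv("data.csv")
    x = convert(Matrix{Float64}, data_raw[:, 2:end])
    y = convert(Vector{Float64}, data_raw[:, 1])

    # The anonymous function closes over x and y, so optimize never touches
    # a global variable:
    beta0 = zeros(size(x, 2))
    res = optimize(beta -> foo(beta, (x, y)), beta0)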
On Tuesday, May 20, 2014 08:34:21 AM Holger Stichnoth wrote:
> The only other thing that I discovered when trying out Julia and Optim.jl
> is that the optimization is currently considerably slower than Matlab's
> fminunc. From the Gist I provided above, are there any potential
> performance impr
Yes, to use autodiff you need to make sure that all of the functions you call
could be applied to Array{T} for all T <: Number. The typing on your code is
currently overly restrictive when you define clogit_ll(beta::Vector{Float64})
and friends. If you loosen things to clogit_ll(beta::Vector), y
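A minimal sketch of what that loosening looks like (the body is a
placeholder; `autodiff = :forward` is the current Optim.jl spelling of the
`autodiff = true` keyword discussed in this thread):

    using Optim

    # Any numeric element type is accepted, including the ForwardDiff Dual
    # numbers that autodiff feeds in; the accumulator follows beta's type.
    function clogit_ll(beta::Vector)
        u = zero(eltype(beta))
        for b in beta
            u += b^2                 # placeholder for the log-likelihood terms
        end
        return u
    end

    res = optimize(clogit_ll, ones(3), BFGS(); autodiff = :forward)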
Glad that you were able to figure out the source of your problems.
It would be good to get a sense of the amount of time spent inside your
objective function vs. the amount of time spent in the code for optimize(). In
general, my experience is that >>90% of the compute time for an optimization
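One rough way to get that sense (warm up first so compilation doesn't
pollute either number; the names are placeholders):

    clogit_ll(beta0);  t_obj = @elapsed clogit_ll(beta0)
    optimize(clogit_ll, beta0);  t_opt = @elapsed optimize(clogit_ll, beta0)

    # If t_obj times the number of function evaluations is close to t_opt,
    # essentially all of the time is inside the objective, not inside Optim.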
When I set autodiff = true in the Gist I posted above, I get the message
"ERROR: no method clogit_ll(Array{Dual{Float64},1},)".
Holger
On Monday, 19 May 2014 14:51:16 UTC+1, John Myles White wrote:
Hi Andreas,
hi John,
hi Miles (via julia-opt, where I mistakenly also posted my question
yesterday),
Thanks for your help. Here is the link to the Gist I created:
https://gist.github.com/anonymous/5f95ab1afd241c0a5962
In the process of producing a minimal (non-)working example, I discovered
th
If you can, please do share an example of your code. Logit-style models are in
general numerically unstable, so it would be good to see how exactly you’ve
coded things up.
One thing you may be able to do is use automatic differentiation via the
autodiff = true keyword to optimize, but that assu
What is the output of versioninfo() and Pkg.installed("Optim")? Also, would
it be possible to make a gist with your code?
2014-05-19 12:44 GMT+02:00 Holger Stichnoth :
Hello,
I installed Julia a couple of days ago and was impressed how easy it was to
make the switch from Matlab and to parallelize my code
(something I had never done before in any language; I'm an economist with
only limited programming experience, mainly in Stata and Matlab).
However, I ran i