Short update: the merging of this PR into Optim.jl has been postponed.
Meanwhile it is available as a separate package:

    Pkg.clone("https://github.com/Ken-B/MinFinder.jl.git")

Feel free to try it out and, as usual, all feedback is welcome :)
Ken
On Sunday, 27 July 2014 23:06:34 UTC-5, Ken B wrote:
Hi Charles,
You can have a look at the MinFinder algorithm for which I've just created
a pull request to Optim.jl (talk about a coincidence!):
https://github.com/JuliaOpt/Optim.jl/pull/72
I'd like to add the possibility to run each optimization in parallel, but I
have no experience with these
Hi Ken
Interesting code you have there. I will have to take a closer look at it.
Yes, I would be happy to collaborate. But let me first try my problem out in
Julia... I am new to Julia and I am currently debating whether the code that
I want to run will be faster in Python using mpi4py or
Ken:
(1) Thanks for pointing out this approach and for implementing it.
Unfortunately, I was not able to locate your code at Github. I would
certainly try it out on some of my examples in global optimization.
(2) Did you include (or do you plan to include) the improvements of MinFinder, as
A package of test functions sounds worthwhile. There's also CUTEst.jl:
https://github.com/lpoo/CUTEst.jl
--Tim
On Sunday, July 27, 2014 06:25:28 AM Hans W Borchers wrote:
Is CUTEst.jl easier to get working these days? The issue I opened in March
seems to still be open.
— John
On Jul 27, 2014, at 6:40 AM, Tim Holy tim.h...@gmail.com wrote:
I haven't tested. I should do so.
--Tim
On Sunday, July 27, 2014 08:51:51 AM John Myles White wrote:
Nope.
One could write a SIF parser from scratch, but it would take some time.
--Tim
There is a large number of optimization test functions (for
derivative-free, black-box optimization) implemented in pure Julia in
Robert Feldt's BlackBoxOptim.jl. According to the source-code comments in
single_objective.jl, the functions are taken from the following sources:
* CEC 2013 competition on
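For a flavour of what such suites contain, here is the Ackley function, a standard multimodal benchmark with global minimum 0 at the origin. This is a generic textbook sketch of mine, not BlackBoxOptim.jl's actual code:

```julia
# Ackley test function: highly multimodal, global minimum f(0, ..., 0) = 0.
# Generic textbook definition, not taken from BlackBoxOptim.jl.
function ackley(x::AbstractVector)
    n = length(x)
    s1 = sum(abs2, x) / n        # mean of squares
    s2 = sum(cos, 2pi .* x) / n  # mean of cos(2*pi*x_i)
    return -20 * exp(-0.2 * sqrt(s1)) - exp(s2) + 20 + exp(1)
end

ackley(zeros(2))    # → 0.0, the global minimum
ackley([1.0, 1.0])  # → ≈ 3.63, inside one of the many local basins
```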
Hi Hans,
1) You're welcome :) The code is in the PR:
https://github.com/JuliaOpt/Optim.jl/pull/73 and if all goes well it could be
merged soon. Do try it out and share your experience.
2) MinFinder 2.0 introduces 2 new stopping rules and an extra validation
rule for sample points. If there is
The idea is to call the optimize function multiple times in parallel, not
to call it once and let it do parallel multistart.
Check out the parallel map and loops section of the parallel programming
chapter in the Julia manual; I think it'll be clearer there.
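A minimal self-contained sketch of that pattern. The toy objective and the crude `local_minimize` below are stand-ins of mine, not Optim.jl code; in practice each call would be `Optim.optimize` on your own function:

```julia
# Multistart via parallel map: one independent local search per random start.
# With no worker processes added (addprocs), pmap simply runs serially.
using Distributed

# toy objective with two minima, at (1, -0.5) and (-1, -0.5)
f(x) = (x[1]^2 - 1)^2 + (x[2] + 0.5)^2

# crude fixed-step gradient descent with finite differences,
# standing in for a real local optimizer such as Optim.optimize
function local_minimize(f, x0; steps = 2_000, eta = 0.02, h = 1e-6)
    x = copy(x0)
    for _ in 1:steps
        g = [(f([j == i ? x[j] + h : x[j] for j in eachindex(x)]) - f(x)) / h
             for i in eachindex(x)]
        x .-= eta .* g
    end
    return x
end

starts = [4 .* rand(2) .- 2 for _ in 1:20]           # 20 random points in [-2, 2]^2
results = pmap(x0 -> local_minimize(f, x0), starts)  # one local search per start
xbest = results[argmin(map(f, results))]             # keep the best of the 20
```

With `addprocs(n)` (and the objective defined on all workers via `@everywhere`) the same `pmap` call distributes the searches across n processes.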
On Friday, July 25, 2014 8:00:40 PM
What you are doing makes sense. Starting from multiple starting points is
important.
I am curious why you don't just run 20 different 1-processor jobs
instead of bothering with the parallelism?
On Saturday, July 26, 2014 11:22:07 AM UTC-5, Iain Dunning wrote:
Yes, I could do that, but it is simpler (I think) to execute the code in
parallel instead of sending 20 separate jobs to be executed on the cluster.
On Saturday, July 26, 2014 10:08:20 AM UTC-7, Michael Prentiss wrote:
I'm not familiar with that particular package, but the Julia way to do it
could be to use the Optim.jl package, create a random set of starting
points, and do a parallel map over that set. It should work
quite well. Trickier (maybe) would be to just give each processor a
Thank you for your answer. So I would have to loop over, say, 20 random
sets of starting points, where in my loop I would use the Optim package to
minimize my MLE function for each random set. Where online is the
documentation that shows how to specify that we want the command
Optim.optimize(my