What you are doing makes sense.  Starting from multiple points is 
important.

I am curious why you don't just run 20 different 1-processor jobs 
instead of bothering with the parallelism?


On Saturday, July 26, 2014 11:22:07 AM UTC-5, Iain Dunning wrote:
>
> The idea is to call the optimize function multiple times in parallel, not 
> to call it once and let it do parallel multistart.
>
> Check out the "parallel map and loops" section of the parallel programming 
> chapter in the Julia manual; I think it'll be clearer there.
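
For concreteness, here is a minimal sketch of that parallel-map approach. The 
objective negloglik and the starting points below are made up for illustration, 
and the syntax follows the Julia version current as of this thread (in newer 
versions, addprocs and pmap live in the Distributed standard library):

    # Add worker processes; adjust the count to your machine.
    addprocs(4)

    # Load Optim and define the objective on every worker.
    @everywhere using Optim

    # Hypothetical objective; replace with your negative log-likelihood.
    @everywhere negloglik(x) = (x[1] - 1.0)^2 + (x[2] + 2.0)^2

    # 20 random starting points, each a 2-element vector.
    starts = [randn(2) for i in 1:20]

    # One independent local optimization per starting point, run in parallel.
    results = pmap(x0 -> optimize(negloglik, x0), starts)

You would then keep the run whose objective value is smallest; the exact field 
or accessor names on the result objects depend on the Optim.jl version.
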
>
> On Friday, July 25, 2014 8:00:40 PM UTC-4, Charles Martineau wrote:
>>
>> Thank you for your answer. So I would have to loop over, say, 20 random 
>> sets of starting points, and in my loop I would use the Optim package to 
>> minimize my MLE function for each random set. Where online is the 
>> documentation that shows how to specify that we want the command 
>>
>> Optim.optimize(my function, etc.) to be parallelized? Sorry for my 
>> ignorance; I am new to Julia!
>>
>>
>> On Friday, July 25, 2014 2:04:08 PM UTC-7, Iain Dunning wrote:
>>>
>>> I'm not familiar with that particular package, but the Julia way to do 
>>> it could be to use the Optim.jl package and create a random set of starting 
>>> points, and do a parallel-map over that set of starting points. Should work 
>>> quite well. Trickier (maybe) would be to just give each processor a 
>>> different random seed and generate starting points on each processor.
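
A minimal sketch of that second alternative, where each worker gets its own 
seed and draws its starting points locally (again, negloglik and run_one are 
hypothetical names; srand was the seeding function in the Julia version of 
this thread, later replaced by Random.seed!):

    addprocs(4)
    @everywhere using Optim

    # Stand-in objective; replace with your negative log-likelihood.
    @everywhere negloglik(x) = (x[1] - 1.0)^2 + (x[2] + 2.0)^2

    # Give every worker a different random seed (myid() differs per process),
    # so the starting points drawn on each worker are independent.
    @everywhere srand(myid())

    # Each run draws its own starting point on whichever worker executes it.
    @everywhere run_one(i) = optimize(negloglik, randn(2))

    results = pmap(run_one, 1:20)
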
>>>
>>> On Friday, July 25, 2014 3:05:05 PM UTC-4, Charles Martineau wrote:
>>>>
>>>> Dear Julia developers and users,
>>>>
>>>> I am currently using the MultiStart algorithm in Matlab to find 
>>>> multiple local minima of an MLE function: 
>>>> http://www.mathworks.com/help/gads/multistart-class.html
>>>> I use MultiStart in a parallel setup as well.
>>>>
>>>> Can I do something similar in Julia using parallel programming?
>>>>
>>>> Thank you
>>>>
>>>> Charles
>>>>
>>>>
