Hi John,
Thanks for the quick reply.

I still don't think I get the output I want. res.minimum only returns the
converged solution (one vector), but I would like the minimum at each of the
1 to N iterations (N vectors). Setting store_trace=true gets me closer, since
res.trace prints the information I'm looking for, but not as a field I can
access (e.g. res.trace.x).
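
To make it concrete, here's roughly what I've been trying (just a sketch, and I'm guessing at how the trace stores the per-iteration points, so the names below may be off):

using Optim

# Rosenbrock function and gradient from the Optim.jl README
function f(x::Vector)
    return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
end

function g!(x::Vector, storage::Vector)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    storage[2] = 200.0 * (x[2] - x[1]^2)
end

res = optimize(f, g!, [0.0, 0.0],
               method = :gradient_descent,
               store_trace = true,
               extended_trace = true)

# My guess: each trace entry keeps the current point in its metadata dict
# under "x", so something like this would give one vector per iteration:
xs = [state.metadata["x"] for state in res.trace]
# (or perhaps res.trace.states, if the trace wraps the states in a field)

Basically it's that xs array I'm after.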

Hope it makes sense.
Best,
Oliver

On Saturday, January 4, 2014 at 16:56:03 UTC+1, John Myles White wrote:
>
> Hi Oliver, 
>
> The result of optimize is an object with a field called minimum that has 
> the solution. 
>
> Try something like the following: 
>
> julia> res = optimize(x -> (10.0 - x[1])^2, [0.0], method = 
> :gradient_descent) 
> Results of Optimization Algorithm 
>  * Algorithm: Gradient Descent 
>  * Starting Point: 0 
>
>  * Minimum: 10.000000000118629 
>
>  * Value of Function at Minimum: 0.000000 
>  * Iterations: 1 
>  * Convergence: true 
>    * |x - x'| < 1.0e-32: false 
>    * |f(x) - f(x')| / |f(x)| < 1.0e-08: false 
>    * |g(x)| < 1.0e-08: true 
>    * Exceeded Maximum Number of Iterations: false 
>  * Objective Function Calls: 4 
>  * Gradient Call: 4 
>
> julia> res.minimum 
> 1-element Array{Float64,1}: 
>  10.0 
>
>  — John 
>
> On Jan 4, 2014, at 10:52 AM, Oliver Lylloff 
> <oliver...@gmail.com> 
> wrote: 
>
> > Hello all, 
> > 
> > I'm trying to get acquainted with the Optim package - so far I think 
> everything looks very interesting. 
> > I would like to get the solution vector of each iteration as an output 
> (e.g. for plotting) - how do I do that? 
> > 
> > Trying the Rosenbrock example from https://github.com/JuliaOpt/Optim.jl 
> > with the extended_trace option 
> > 
> > optimize(f,g!,[0.0,0.0],method=:gradient_descent,extended_trace=true) 
> > 
> > prints the solution vector x at each iteration but I can't seem to 
> access it and store it. Any ideas? 
> > 
> > 
> > Best, 
> > Oliver 
> > 
> > 
>
>
