Thanks John, 

Still trying to get the hang of the basics, but this seemed to do the trick:

res = optimize(f, g!, [0.0, 0.0], method = :gradient_descent,
               store_trace = true, extended_trace = true);

n = res.iterations + 1;    # +1: the trace also records the initial state
x_iter = zeros(2, n);
for i = 1:n
    x_iter[:, i] = res.trace.states[i].metadata["x"];
end
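
By the way, plotting the stored iterates from here is straightforward.
A minimal sketch with PyPlot (the plotting package is just my
assumption; any package that draws lines will do):

using PyPlot
plot(vec(x_iter[1, :]), vec(x_iter[2, :]), "o-")   # path of the iterates
xlabel("x[1]"); ylabel("x[2]")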

Best, 
Oliver

On Saturday, January 4, 2014 at 17:19:56 UTC+1, John Myles White wrote:
>
> Hi Oliver, 
>
> If you’re looking for the current state at each iteration, you want to 
> check res.trace.states. 
>
> One way to ferret out this kind of information is to use the names 
> function: 
>
> julia> names(res) 
> 15-element Array{Symbol,1}: 
>  :method             
>  :initial_x           
>  :minimum             
>  :f_minimum           
>  :iterations         
>  :iteration_converged 
>  :x_converged         
>  :xtol               
>  :f_converged         
>  :ftol               
>  :gr_converged       
>  :grtol               
>  :trace               
>  :f_calls             
>  :g_calls             
>
> julia> names(res.trace) 
> 1-element Array{Symbol,1}: 
>  :states 
>
> Of course, you still need to figure out what the fields mean, but we’ve 
> tried to use sensible names for the fields in Optim. 
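>
> For example, once you know the field names, digging one level deeper is
> just more of the same (a sketch; this assumes extended_trace=true was
> set, so each state's metadata holds the iterate under the key "x"):
>
> julia> res.trace.states[end].metadata["x"]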
>
>  — John 
>
> On Jan 4, 2014, at 11:14 AM, Oliver Lylloff <oliver...@gmail.com> wrote: 
>
> > Hi John, 
> > Thanks for the fast answer. 
> > 
> > I still don't think I get the output I want. res.minimum only returns 
> > the converged solution (one vector), whereas I would like the minimum 
> > at every iteration, 1 to N (N vectors). Setting store_trace=true gets 
> > me closer, since res.trace prints the information I'm seeking, but not 
> > as a field (res.trace.x). 
> > 
> > Hope it makes sense. 
> > Best, 
> > Oliver 
> > 
> > On Saturday, January 4, 2014 at 16:56:03 UTC+1, John Myles White wrote: 
> > Hi Oliver, 
> > 
> > The result of optimize is an object with a field called minimum that 
> > has the solution. 
> > 
> > Try something like the following: 
> > 
> > julia> res = optimize(x -> (10.0 - x[1])^2, [0.0], method = :gradient_descent) 
> > Results of Optimization Algorithm 
> >  * Algorithm: Gradient Descent 
> >  * Starting Point: 0 
> > 
> >  * Minimum: 10.000000000118629 
> > 
> >  * Value of Function at Minimum: 0.000000 
> >  * Iterations: 1 
> >  * Convergence: true 
> >    * |x - x'| < 1.0e-32: false 
> >    * |f(x) - f(x')| / |f(x)| < 1.0e-08: false 
> >    * |g(x)| < 1.0e-08: true 
> >    * Exceeded Maximum Number of Iterations: false 
> >  * Objective Function Calls: 4 
> >  * Gradient Call: 4 
> > 
> > julia> res.minimum 
> > 1-element Array{Float64,1}: 
> >  10.0 
> > 
> >  — John 
> > 
> > On Jan 4, 2014, at 10:52 AM, Oliver Lylloff <oliver...@gmail.com> wrote: 
> > 
> > > Hello all, 
> > > 
> > > I'm trying to get acquainted with the Optim package - so far 
> > > everything looks very interesting. 
> > > I would like to get the solution vector at each iteration as an 
> > > output (e.g. for plotting) - how do I do that? 
> > > 
> > > Trying the Rosenbrock example from 
> > > https://github.com/JuliaOpt/Optim.jl with the extended_trace option 
> > > 
> > > optimize(f, g!, [0.0, 0.0], method = :gradient_descent, extended_trace = true) 
> > > 
> > > prints the solution vector x at each iteration, but I can't seem to 
> > > access it and store it. Any ideas? 
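> > > 
> > > (For reference, f and g! here are the Rosenbrock objective and its 
> > > in-place gradient, roughly as in the README.) 
> > > 
> > > f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2 
> > > 
> > > function g!(x, storage) 
> > >     storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1] 
> > >     storage[2] = 200.0 * (x[2] - x[1]^2) 
> > > end 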
> > > 
> > > 
> > > Best, 
> > > Oliver 
> > > 
> > > 
> > 
>
>
