Dear All,

I am trying to figure out the computational complexity of the gradient
method for minimizing an objective function.

Assume J is a scalar objective function of N vectors, each of dimension
k. We are trying to find a k-dimensional vector x that minimizes J. For
example, J could be the sum of squared distances from N points in R^k
to the line segment whose endpoints are a known point a and the unknown
point x in R^k.
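
To make this concrete, here is a rough NumPy sketch of the J I have in
mind (the names J, P, etc. are just my own labels: P holds the N points
as rows):

    import numpy as np

    def J(x, a, P):
        # P: (N, k) array of points; a, x: (k,) endpoints of the segment.
        d = x - a
        # Projection of each point onto the line through a and x,
        # clamped to [0, 1] so the closest point stays on the segment.
        t = np.clip((P - a) @ d / (d @ d), 0.0, 1.0)
        closest = a + np.outer(t, d)        # (N, k) closest segment points
        return np.sum((P - closest) ** 2)   # sum of squared distances

One evaluation of J touches all N points in all k coordinates, so it
costs O(kN).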

Given this definition of J, our approach is to set dJ/dx = 0 and solve
for the vector x. However, the equation dJ/dx = 0 usually cannot be
solved in closed form, i.e., x cannot be obtained directly and must
instead be found by some iterative approach.
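
By "iterative approach" I mean something like plain fixed-step gradient
descent; a minimal sketch (grad stands for the gradient of J, analytic
or numerical, and lr is a placeholder step size):

    import numpy as np

    def gradient_descent(grad, x0, lr=1e-3, tol=1e-8, max_iter=10000):
        # Generic fixed-step descent: each iteration costs one gradient
        # evaluation plus an O(k) vector update.
        x = x0.copy()
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            x = x - lr * g
        return x

If an analytic gradient of J costs O(kN) per evaluation (one pass over
the N points), then each iteration is O(kN), and the total work is
O(kN) times the number of iterations, which depends on the stopping
tolerance rather than on N and k alone.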

For this kind of optimization problem, how should the computational
complexity be defined?
Is it O(kN) or O(kN^2)?

And how would one verify this?
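
The only check I can think of is empirical: time a fixed number of
evaluations while doubling N at fixed k (and then k at fixed N) and see
whether the runtime roughly doubles, as it should if the cost is O(kN).
A rough timing sketch, reusing the J function above (the sizes are
arbitrary):

    import time
    import numpy as np

    def time_J(N, k, reps=10):
        # Average wall-clock time of one evaluation of J on random data.
        rng = np.random.default_rng(0)
        P = rng.standard_normal((N, k))
        a = rng.standard_normal(k)
        x = rng.standard_normal(k)
        t0 = time.perf_counter()
        for _ in range(reps):
            J(x, a, P)
        return (time.perf_counter() - t0) / reps

    for N in (10000, 20000, 40000):
        print(N, time_J(N, k=50))

But I am not sure timing alone is a principled way to establish the
complexity.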

Thanks a lot for any pointers.

Fred



