Oleg,
would you mind posting the ceres issues either on the ceres mailinglist or
on https://github.com/ceres-solver/ceres-solver/issues.
Sameer


On Mon, Jun 1, 2020 at 8:44 AM Oleg Shirokobrod <[email protected]>
wrote:

> Hi Sameer,
>
> Thank you for the quick reply and the advice.
> 1. I implemented it using both decompositions, SVD and QR. QR decomposition
> has the same problem as SVD for Jet types, but the explicit expression for
> the solution is much more complex.
> 2. I implemented not only a CostFunctor with operator() for autodiff but
> also a CostFunction with an Evaluate method. In that method I calculate the
> residuals and the Jacobian using an approach similar to your proposal. This
> is the main idea of the variable projection method: use the pseudo-inverse
> solution from the SVD or QR decomposition to eliminate the linear parameters
> (project the linear variables out), and then calculate the Jacobian from a
> closed-form expression for the residuals. But calculating that Jacobian is
> not trivial 1),2) and is time consuming, so I am going to profile both
> approaches. The physical meaning of my simplified model is a linear
> combination of Gaussians: the linear parameters are their amplitudes, the
> nonlinear ones are the peak positions and widths. For the unconstrained
> linear problem there is a closed-form expression for the residuals, which we
> can differentiate. If we put constraints on the linear parameters, e.g.
> nonnegativity, there is no closed-form expression for the residuals and we
> have to differentiate an algorithm, for example quadratic programming with
> constraints. There are articles that propose calculating the Jacobian of the
> corresponding unconstrained problem near the optimum; I think it is
> beneficial in this case to calculate the Jacobian of the algorithm solving
> the exact task.
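To make the elimination step concrete, here is a minimal sketch of the variable projection idea for a single Gaussian (a hypothetical toy model, not the code from this thread): for a fixed nonlinear peak position the optimal amplitude has a closed form, and the projected residual is evaluated at that amplitude.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Toy model: y(t) = c * exp(-(t - p)^2), with linear amplitude c and
// nonlinear peak position p. For fixed p, the least-squares-optimal
// amplitude is c*(p) = (g . b) / (g . g), where g_i = exp(-(t_i - p)^2).
double best_amplitude(const std::vector<double>& t,
                      const std::vector<double>& b, double p) {
    double gb = 0.0, gg = 0.0;
    for (std::size_t i = 0; i < t.size(); ++i) {
        double g = std::exp(-(t[i] - p) * (t[i] - p));
        gb += g * b[i];  // accumulate g . b
        gg += g * g;     // accumulate g . g
    }
    return gb / gg;
}

// Projected residual norm |b - c*(p) * g|: a function of p alone,
// since the linear parameter has been eliminated.
double projected_residual_norm(const std::vector<double>& t,
                               const std::vector<double>& b, double p) {
    double c = best_amplitude(t, b, p);
    double sum = 0.0;
    for (std::size_t i = 0; i < t.size(); ++i) {
        double g = std::exp(-(t[i] - p) * (t[i] - p));
        double r = b[i] - c * g;
        sum += r * r;
    }
    return std::sqrt(sum);
}
```

With several Gaussians the same elimination becomes the pseudo-inverse solve discussed above, and differentiating the projected residual w.r.t. p is exactly the nontrivial step treated in references 1) and 2).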
> 3. BTW, I compiled ceres solver, glog and gflags with VS2019. I do not
> remember if I tweaked the CMake VS projects a bit; I can provide solutions
> for these projects. I added derivatives for the erf and erfc functions,
> which I use and which are part of the C++11 standard. Besides, I had to wrap
> class GOOGLE_GLOG_DLL_DECL CheckOpMessageBuilder and
> class GOOGLE_GLOG_DLL_DECL LogMessageVoidify in glog's logging.h in the
> following pragmas:
>
> #ifdef _MSC_VER
> #pragma optimize( "", off )
> #endif // _MSC_VER
>
> // ... class declarations ...
>
> #ifdef _MSC_VER
> #pragma optimize( "", on )
> #endif // _MSC_VER
>
> Without this, the VS compiler in release mode optimizes out some member
> functions, and the linker then gives unresolved external error messages.
>
> References
> 1) G. H. Golub and V. Pereyra, The differentiation of pseudo-inverses and
> nonlinear least squares problems whose variables separate. SIAM J. Numer.
> Anal., Vol. 10, No. 2, 1973.
> 2) D. P. O'Leary and B. W. Rust, Variable Projection for Nonlinear Least
> Squares Problems.
>
> Again, thank you very much. I will think about your advice.
>
> Oleg
>
>
> On Mon, Jun 1, 2020 at 4:57 PM Sameer Agarwal <[email protected]>
> wrote:
>
>> Oleg,
>> Two ideas:
>>
>> 1. You may have an easier time using QR factorization instead of SVD to
>> solve your least squares problem.
>> 2. But you can do better: instead of trying to solve a linear least
>> squares problem involving a matrix of Jets, you are better off solving the
>> linear least squares problem on the scalars, and then using the implicit
>> function theorem
>> <https://en.wikipedia.org/wiki/Implicit_function_theorem> to compute the
>> derivative w.r.t. the parameters, and then applying the chain rule.
>>
>> i.e., start with min |A x - b|
>>
>> the solution satisfies the equation
>>
>> A'A x - A'b = 0.
>>
>> Solve this equation to get the optimal value of x, then compute the
>> Jacobian of this equation w.r.t. A, b and x, and apply the implicit
>> function theorem.
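A minimal numeric sketch of this implicit-function-theorem approach, assuming a scalar model a(p)·x ≈ b so the normal equation and its derivatives stay closed-form (the function names and the choice a(p) = 1 + p² are illustrative only):

```cpp
#include <cassert>
#include <cmath>

// Toy problem: min_x |a(p) * x - b|^2 with a(p) = 1 + p*p.
// The solution satisfies the normal equation F(x, p) = a^2 * x - a * b = 0.

// Solve the normal equation on scalars (no Jets involved).
double solve_x(double p, double b) {
    double a = 1.0 + p * p;
    return a * b / (a * a);  // x = b / a
}

// Implicit function theorem: dx/dp = -(dF/dp) / (dF/dx),
// evaluated at the solved x.
double dxdp_implicit(double p, double b) {
    double a  = 1.0 + p * p;
    double da = 2.0 * p;                       // da/dp
    double x  = solve_x(p, b);
    double dFdx = a * a;                       // d(a^2 x - a b)/dx
    double dFdp = 2.0 * a * da * x - da * b;   // d(a^2 x - a b)/dp
    return -dFdp / dFdx;
}
```

For x = b/a the analytic derivative is dx/dp = -b·a'/a², which the implicit-function computation reproduces without ever differentiating through the solver; with a matrix A(p) the same recipe applies to A'A x - A'b = 0.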
>>
>> Sameer
>>
>>
>> On Mon, Jun 1, 2020 at 4:46 AM Oleg Shirokobrod <
>> [email protected]> wrote:
>>
>>> Hi list, I am using Eigen 3.3.7 release with ceres solver 1.14.0 with
>>> autodiff Jet data type, and I have run into some problems. I need to solve
>>> a linear least squares subproblem within a variable projection algorithm,
>>> namely the LLS equation
>>> A(p)*x = b
>>> Where matrix A(p) depends on nonlinear parameters p:
>>> x(p) = pseudo-inverse(A(p))*b;
>>> x(p) will be optimized in nonlinear least squares fitting, so I need the
>>> Jacobian. The rhs b is a measured vector of doubles, e.g. VectorXd. In
>>> order to use ceres's autodiff, p must be of Jet type. Ceres provides the
>>> corresponding traits for binary operations:
>>>
>>> #if EIGEN_VERSION_AT_LEAST(3, 3, 0)
>>> // Specifying the return type of binary operations between Jets and
>>> // scalar types allows you to perform matrix/array operations with Eigen
>>> // matrices and arrays such as addition, subtraction, multiplication, and
>>> // division where one Eigen matrix/array is of type Jet and the other is
>>> // a scalar type. This improves performance by using the optimized
>>> // scalar-to-Jet binary operations but is only available on Eigen
>>> // versions >= 3.3
>>> template <typename BinaryOp, typename T, int N>
>>> struct ScalarBinaryOpTraits<ceres::Jet<T, N>, T, BinaryOp> {
>>>   typedef ceres::Jet<T, N> ReturnType;
>>> };
>>> template <typename BinaryOp, typename T, int N>
>>> struct ScalarBinaryOpTraits<T, ceres::Jet<T, N>, BinaryOp> {
>>>   typedef ceres::Jet<T, N> ReturnType;
>>> };
>>> #endif  // EIGEN_VERSION_AT_LEAST(3, 3, 0)
>>>
>>> There are two problems.
>>> 1. Small problem. In the function "RealScalar threshold() const" in
>>> SVDBase.h I have to replace
>>>
>>> return m_usePrescribedThreshold ? m_prescribedThreshold
>>>                                 : diagSize * NumTraits<Scalar>::epsilon();
>>>
>>> with
>>>
>>> return m_usePrescribedThreshold ? m_prescribedThreshold
>>>                                 : Scalar(diagSize) * NumTraits<Scalar>::epsilon();
>>>
>>> This fix is similar to Gael's fix of Bug 1403
>>> <http://eigen.tuxfamily.org/bz/show_bug.cgi?id=1403>
>>> 2. It is less trivial. I expect that x(p) = pseudo-inverse(A(p))*b is a
>>> vector of Jet. And that is actually true if I apply, e.g., the SVD
>>> decomposition explicitly:
>>> x(p) = V * S^-1 * U^T * b.
>>> But if I use
>>> JacobiSVD<Matrix<Jet<double, 2>, Dynamic, Dynamic>> svd(A);
>>> x(p) = svd.solve(b);
>>> I get an error message. Here is code reproducing the error:
>>>
>>> // test_svd_jet.cpp
>>> #include <Eigen/Dense>
>>> #include <ceres/jet.h>
>>> using ceres::Jet;
>>> using namespace Eigen;
>>>
>>> int test_svd_jet()
>>> {
>>>     typedef Matrix<Jet<double, 2>, Dynamic, Dynamic> Mat;
>>>     typedef Matrix<Jet<double, 2>, Dynamic, 1> Vec;
>>>     Mat A = MatrixXd::Random(3, 2).cast<Jet<double, 2>>();
>>>     VectorXd b = VectorXd::Random(3);
>>>     JacobiSVD<Mat> svd(A, ComputeThinU | ComputeThinV);
>>>     int l_rank = svd.rank();
>>>     Vec c = svd.matrixV().leftCols(l_rank)
>>>         * svd.singularValues().head(l_rank).asDiagonal().inverse()
>>>         * svd.matrixU().leftCols(l_rank).adjoint() * b; // *
>>>     Vec c1 = svd.solve(b.cast<Jet<double, 2>>()); // **
>>>     Vec c2 = svd.solve(b); // ***
>>>     return 0;
>>> }
>>> // End test_svd_jet.cpp
>>>
>>> // * and // ** work fine and give the same results. // *** fails with the
>>> VS 2019 error message
>>> Eigen\src\Core\functors\AssignmentFunctors.h(24,1):
>>> error C2679: binary '=': no operator found which takes
>>> a right-hand operand of type 'const SrcScalar'
>>> (or there is no acceptable conversion)
>>> The error points to line // ***. I think the solution is of type VectorXd
>>> instead of Vec, and there is a problem with the assignment of double to
>>> Jet; the derivatives are lost as well. It should work similarly to the
>>> complex type: if A is a complex matrix and b is a real vector, x must be
>>> complex. There is something wrong with the type deduction in the SVD or QR
>>> decomposition.
>>>
>>> Do you have any idea how to fix it?
>>>
>>> Best regards,
>>>
>>> Oleg Shirokobrod
>>>
>>>
