There appears to be a bug in gsl_linalg_complex_LU_decomp: it returns incorrect results even for matrices that contain only real values.
Demo:

```
#include <stdio.h>
#include <gsl/gsl_linalg.h>
#include <gsl_complex.h>
#include <gsl_complex_math.h>

int main (void)
{
  gsl_matrix_complex* m = gsl_matrix_complex_alloc (3, 3);

  gsl_matrix_complex_set (m, 0, 0, (gsl_complex){ .dat = {1, 0} });
  gsl_matrix_complex_set (m, 0, 1, (gsl_complex){ .dat = {2, 0} });
  gsl_matrix_complex_set (m, 0, 2, (gsl_complex){ .dat = {3, 0} });
  gsl_matrix_complex_set (m, 1, 0, (gsl_complex){ .dat = {4, 0} });
  gsl_matrix_complex_set (m, 1, 1, (gsl_complex){ .dat = {5, 0} });
  gsl_matrix_complex_set (m, 1, 2, (gsl_complex){ .dat = {6, 0} });
  gsl_matrix_complex_set (m, 2, 0, (gsl_complex){ .dat = {7, 0} });
  gsl_matrix_complex_set (m, 2, 1, (gsl_complex){ .dat = {8, 0} });
  gsl_matrix_complex_set (m, 2, 2, (gsl_complex){ .dat = {9, 0} });

  int signum = 0;
  gsl_permutation* perm = gsl_permutation_calloc (3);
  gsl_linalg_complex_LU_decomp (m, perm, &signum);

  for (size_t i = 0; i < 3; i ++)
    {
      for (size_t j = 0; j < 3; j ++)
        {
          printf ("(%0.2f, %0.2f)",
                  GSL_REAL (gsl_matrix_complex_get (m, i, j)),
                  GSL_IMAG (gsl_matrix_complex_get (m, i, j)));
        }
    }

  gsl_permutation_free (perm);
  gsl_matrix_complex_free (m);
  return 0;
}
```

I compiled and ran this with (the source file goes before the `-l` flags so the linker can resolve the GSL symbols):

```
gcc -I. -I/usr/local/Cellar/gsl/2.7.1/include/gsl bug.c -L/usr/local/Cellar/gsl/2.7.1/lib -lgsl -lgslcblas -o bug
./bug
```

The output is:

> (7.00, 0.00)(8.00, 0.00)(9.00, 0.00)(0.14, 0.00)(0.86, 0.00)(1.71,
> 0.00)(0.57, 0.00)(0.50, 0.00)(0.00, 0.00)

The correct LU decomposition, as computed by Maxima's lu_factor, is:

```
(%i3) get_lu_factors (lu_factor (matrix ([1, 2, 3], [4, 5, 6], [7, 8, 9])));
          [ 1  0  0 ]  [ 1  0  0 ]  [ 1   2    3  ]
          [         ]  [         ]  [             ]
(%o3)    [[ 0  1  0 ], [ 4  1  0 ], [ 0  - 3  - 6 ]]
          [         ]  [         ]  [             ]
          [ 0  0  1 ]  [ 7  2  1 ]  [ 0   0    0  ]
```

Please let me know if I made a mistake or, if this is a true bug, what the timeline for fixing it is.

Thanks,
- Larry