I have added lpx_scale_prob(lp) and lpx_set_int_parm(lp, LPX_K_PRESOL, 1); to my code, and things seem to have stabilised: some large data sets that were previously problematic can now be solved. However, some very large data sets remain problematic. A file with the problem data in fixed MPS format will be on its way soon.
For now, if I do not get a solution after 15 minutes, or if the "numerical instability" error message appears, I restart the process. Is this reasonable? And if, after a long time, the "objval" is still extremely small, say 1.5 when I expect the solution to be around 15, should I restart the process?

Is the final result scaled in some way? That is, when "OPTIMAL SOLUTION FOUND" reports, say, "Z=1.000", should I multiply it by a constant factor?

--
This message was sent on behalf of [EMAIL PROTECTED] at openSubscriber.com
http://www.opensubscriber.com/message/help-glpk@gnu.org/3773206.html
_______________________________________________
Help-glpk mailing list
Help-glpk@gnu.org
http://lists.gnu.org/mailman/listinfo/help-glpk