If I understand your question, you are seeking a solution to a massively
underdetermined system of linear equations, Ax = y, for which the vector x is
sparse, with most elements being zero.
I spent 2013-2016 reading "A Mathematical Introduction to Compressive Sensing"
by Foucart and Rauhut, which required that I first read Mallat's "A Wavelet
Tour of Signal Processing" and then reread Foucart and Rauhut. Happily, it is
easier to do than it is to understand. FWIW, the three years included reading
all the original papers, around 3000 pages in total. A non-trivial effort, but
worth it just for the magic of it all. The best thing since Norbert Wiener
invented DSP.
The seminal paper is Donoho's 2004-2009 proof that, provided A possesses the
Restricted Isometry Property, a sufficiently sparse solution of the L1 problem
(minimize ||x||_1 subject to Ax = y) is also the L0 solution, i.e. the sparsest
one. The RIP roughly says that no small set of columns of A is close to
linearly dependent; verifying it directly is itself a combinatorial (L0)
problem, so it cannot be checked in practice. All you can do is solve the L1
problem using linear programming. You are not guaranteed the sparsest
solution, but if you get one it is probably the L0 solution.
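As a concrete illustration (my sketch, not from the original thread): the L1
problem can be rewritten as an ordinary LP by splitting x into nonnegative
parts, x = u - v, and minimizing sum(u) + sum(v) subject to A(u - v) = y. Here
is a minimal self-contained example using scipy's LP solver; the dimensions
and the random matrix are arbitrary choices for the demo.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n = 20, 50                       # 20 equations, 50 unknowns: underdetermined
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[[3, 17, 41]] = [1.5, -2.0, 0.7]   # a 3-sparse signal
y = A @ x_true

# Basis pursuit:  min ||x||_1  s.t.  Ax = y,
# recast as an LP with x = u - v, u >= 0, v >= 0:
#   min  sum(u) + sum(v)   s.t.   [A  -A] [u; v] = y
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))

x_hat = res.x[:n] - res.x[n:]
# With enough random measurements relative to the sparsity, x_hat typically
# coincides with x_true; in any case it is a minimum-1-norm feasible solution.
```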
I did quite a lot of work solving heat-flow equations this way. I used awk to
generate huge GMPL files, which I then ran with the command-line solver, over
tens of thousands of runs. It worked brilliantly.
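The generate-and-solve pipeline looks like this (a minimal sketch in Python
rather than awk; the toy model and the file name model.mod are placeholders of
my own, and glpsol is assumed to be on your PATH):

```python
import shutil
import subprocess

# In place of awk: programmatically emit a GMPL/MathProg model file.
# This is a toy LP; in real use the model/data section would be
# generated from the problem data.
model = """\
var x >= 0;
var y >= 0;
minimize cost: x + 2*y;
s.t. demand: x + y >= 10;
solve;
printf "x=%g, y=%g\\n", x, y;
end;
"""
with open("model.mod", "w") as f:
    f.write(model)

# glpsol is GLPK's command-line solver; only run it if it is installed.
if shutil.which("glpsol"):
    subprocess.run(["glpsol", "--math", "model.mod"], check=True)
```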
Have Fun!
Reg
On Friday, May 16, 2025 at 04:55:34 PM CDT, Federico Miyara
<[email protected]> wrote:
I need to solve the following problem:
I have an alphabet of n symbols and a dictionary of N words of m symbols each
(n on the order of tens, N on the order of tens of thousands, m = 4, say).
Assuming each symbol has a definite probability, I need to generate a list of
M words (M on the order of 100) taken from the dictionary in which the
proportion of each symbol matches the required probability as closely as
possible.
Is this a problem that can be solved using GLPK?
Thanks.
Best regards,
Federico Miyara