Folks,
I have a project with a large number of variables and an
outcome, and I need to figure out which 20% of the variables have the
largest effect on the outcome. Of course I also need to optimize the
20% of variables I end up with.
This sounds like a job for a neural network to me, with arguments
possibly optimized through genetic algorithms. I'm wondering, though,
if I'm complicating things for myself and there's an easier approach
to this. If not, I'm wondering if there are ready-made NN or GA
libraries for Haskell.
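(For what it's worth, one "easier approach" as a first pass might be plain
univariate filtering: rank each variable by the absolute value of its Pearson
correlation with the outcome and keep the top 20%, before reaching for an NN/GA.
Here's a minimal, self-contained Haskell sketch of that idea; the function names
`pearson` and `topFifth` are mine, not from any library, and it assumes variables
are given as columns of equal length with nonzero variance.)

```haskell
import Data.List (sortBy)
import Data.Ord (comparing, Down (..))

-- Pearson correlation between two equal-length samples.
pearson :: [Double] -> [Double] -> Double
pearson xs ys = cov / (sd xs * sd ys)
  where
    n      = fromIntegral (length xs)
    mean v = sum v / n
    mx     = mean xs
    my     = mean ys
    cov    = sum (zipWith (\x y -> (x - mx) * (y - my)) xs ys) / n
    sd v   = sqrt (sum (map (\x -> (x - mean v) ^ (2 :: Int)) v) / n)

-- Rank variables (columns) by |correlation| with the outcome,
-- keeping the indices of the top 20% (at least one).
topFifth :: [[Double]] -> [Double] -> [Int]
topFifth columns outcome = take k (map fst ranked)
  where
    k      = max 1 (length columns `div` 5)
    ranked = sortBy (comparing (Down . snd))
                    [ (i, abs (pearson col outcome))
                    | (i, col) <- zip [0 ..] columns ]
```

This only catches linear marginal effects, of course, so it misses interactions
between variables; but it's cheap, and it gives a baseline to beat before the
NN/GA machinery comes in.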
Also, I'm curious whether Haskell is really the best language for this
type of problem, and whether lazy evaluation brings any specific
advantages to the solution or would be a hindrance.
I would welcome any pointers and feedback. Yes, someone is actually
paying me to do this :-).
Thanks, Joel
--
http://wagerlabs.com/
_______________________________________________
Haskell-Cafe mailing list
Haskell-Cafe@haskell.org
http://www.haskell.org/mailman/listinfo/haskell-cafe