
Folks,
I've got a project with a large number of variables and an outcome, and I need to figure out which 20% of the variables have the largest effect on the outcome. Of course, I also need to optimize the 20% of variables I end up with.
This sounds like a job for a neural network to me, with its arguments possibly optimized through genetic algorithms. I'm wondering, though, if I'm complicating things for myself and there's an easier approach. If not, are there ready-made NN or GA libraries for Haskell?
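
Roughly, I'm picturing something like the sketch below: a candidate subset of variables represented as a list of flags, with GA-style mutation and crossover over those flags. The 'fitness' function here is just a placeholder - in practice it would train and evaluate whatever model I end up with on the selected variables.

import Control.Monad (replicateM)
import Data.List (sortBy)
import System.Random (randomRIO)

-- One flag per candidate variable: True = keep, False = drop.
type Subset = [Bool]

-- Placeholder fitness: in reality this would train the model on the
-- selected variables and return some measure of prediction quality.
fitness :: Subset -> Double
fitness _ = 0

-- A random starting subset over n variables.
randomSubset :: Int -> IO Subset
randomSubset n = replicateM n (randomRIO (False, True))

-- Mutation: flip each flag with a small probability.
mutate :: Double -> Subset -> IO Subset
mutate rate = mapM flipMaybe
  where
    flipMaybe b = do
      p <- randomRIO (0, 1 :: Double)
      return (if p < rate then not b else b)

-- One-point crossover between two parent subsets.
crossover :: Subset -> Subset -> IO Subset
crossover a b = do
  k <- randomRIO (0, length a)
  return (take k a ++ drop k b)

-- One generation: keep the fitter half of the population, then refill
-- by crossing each survivor with the current best and mutating the result.
generation :: [Subset] -> IO [Subset]
generation pop = do
  let ranked    = sortBy (\a b -> compare (fitness b) (fitness a)) pop
      survivors = take (length pop `div` 2) ranked
  children <- mapM (\p -> crossover p (head survivors) >>= mutate 0.05) survivors
  return (survivors ++ children)

A real run would iterate 'generation' until the score stops improving, but that's the general shape of what I have in mind.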
Also, I'm curious whether Haskell is really the best language for this type of problem, and whether lazy evaluation brings any specific advantages to the solution or is a hindrance.
Check this paper - it seems they solved a similar problem with a hill-climbing algorithm: http://www.cs.uu.nl/dazzle/f08-schrage.pdf
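
If I read the paper right, the hill-climbing variant would be even simpler - something along these lines, again just a sketch with the scoring function left abstract (it would be the same hypothetical model-evaluation function as above):

import Data.List (maximumBy)

type Subset = [Bool]

-- Greedy hill-climbing over variable subsets: repeatedly flip the single
-- flag whose flip improves the score the most, and stop when no flip helps.
hillClimb :: (Subset -> Double) -> Subset -> Subset
hillClimb score s
  | null s               = s
  | score best > score s = hillClimb score best
  | otherwise            = s
  where
    flipAt i   = take i s ++ [not (s !! i)] ++ drop (i + 1) s
    neighbours = map flipAt [0 .. length s - 1]
    best       = maximumBy (\a b -> compare (score a) (score b)) neighbours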
I would welcome any pointers and feedback; yes, someone is actually paying me to do this :-).
Thanks, Joel