Hello Rahul,

NEAT is somewhat special in that it doesn't optimize a static function; it
evolves the function itself. So you are right that you could use NEAT as an
independent method/optimizer in its own right. On the other hand, you could
transform e.g. the logistic regression problem into a problem that can be
solved via NEAT, simply by evolving the LR model parameters instead of e.g.
learning the classification directly. This somewhat follows the idea behind
the "Learning to learn by gradient descent by gradient descent" paper.

I hope this was helpful; let me know if I should clarify anything.

Thanks,
Marcus

> On 4. Mar 2019, at 19:52, Rahul Prabhu <cupertin...@gmail.com> wrote:
> 
> Hi,
>     I was wondering about how NEAT would be used as an optimizer in the 
> ensmallen package? From my understanding of NEAT, it acts as a function 
> approximator more than an optimizer, and hence, in my view doesn't really fit 
> well with the rest of ensmallen. Am I looking at it wrong? If so, could you 
> give me a clue as to the right direction with this project?
> 
> Thanks,
> Rahul
> _______________________________________________
> mlpack mailing list
> mlpack@lists.mlpack.org
> http://knife.lugatgt.org/cgi-bin/mailman/listinfo/mlpack
