GitHub user bgreeven commented on the pull request:

    https://github.com/apache/spark/pull/1290#issuecomment-61749598
  
    Let's discuss a bit more about making the optimizer, updater, gradient, and error function customizable.
    
    Notice that in the current LBFGS setup, the error function and the gradient are tightly coupled: the gradient is the derivative of the error function being minimized, and both the updater and the optimizer consume that gradient. Hence, to make the error function pluggable, the gradient needs to be pluggable as well.
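
    For reference, MLlib's `Gradient` interface already reflects this coupling: `compute` returns the gradient and the loss value in one call, so any custom error function necessarily arrives packaged as a `Gradient` implementation. A minimal sketch, using a toy absolute-error loss for a linear predictor (the class name is made up):

    ```scala
    import org.apache.spark.mllib.linalg.{Vector, Vectors}
    import org.apache.spark.mllib.optimization.Gradient

    // Hypothetical example: an absolute-error loss for a linear predictor.
    // A new error function has to be written as a Gradient, because compute()
    // must return the gradient and the loss together.
    class AbsoluteErrorGradient extends Gradient {

      override def compute(data: Vector, label: Double, weights: Vector): (Vector, Double) = {
        val prediction = data.toArray.zip(weights.toArray).map { case (x, w) => x * w }.sum
        val diff = prediction - label
        val sign = if (diff > 0) 1.0 else if (diff < 0) -1.0 else 0.0
        // d/dw |x . w - y| = sign(x . w - y) * x
        (Vectors.dense(data.toArray.map(_ * sign)), math.abs(diff))
      }

      override def compute(
          data: Vector,
          label: Double,
          weights: Vector,
          cumGradient: Vector): Double = {
        val (gradient, loss) = compute(data, label, weights)
        // Accumulate in place; assumes cumGradient is dense, as in MLlib's optimizers.
        val cum = cumGradient.toArray
        gradient.toArray.zipWithIndex.foreach { case (g, i) => cum(i) += g }
        loss
      }
    }
    ```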
    
    I think there would be value in making the "updater" and "optimizer" pluggable too. For the optimizer we have already seen two candidates, LBFGS and SGD, each with its pros and cons, and there may be other optimizers that use something other than the gradient. Since the updater currently depends on the gradient, I suggest making it pluggable as well. (I played around a bit with a genetic optimizer - it doesn't work very well, but it is an example of an optimizer that doesn't use the gradient; see the gradient-free sketch below.)
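
    To make that parenthetical concrete, here is a sketch of a gradient-free optimizer behind MLlib's `Optimizer` trait - a plain random search rather than a genetic algorithm, but gradient-free in the same way. The `lossFn` parameter is an assumption, standing in for whichever pluggable error function we settle on:

    ```scala
    import scala.util.Random
    import org.apache.spark.mllib.linalg.{Vector, Vectors}
    import org.apache.spark.mllib.optimization.Optimizer
    import org.apache.spark.rdd.RDD

    // Hypothetical sketch: a random-search optimizer. It satisfies the same
    // Optimizer trait as LBFGS and SGD, but never touches a Gradient -
    // it only needs an error function to evaluate candidate weights.
    class RandomSearchOptimizer(
        lossFn: (RDD[(Double, Vector)], Vector) => Double,  // assumed pluggable error function
        numIterations: Int = 100,
        stepScale: Double = 0.1,
        seed: Long = 42L) extends Optimizer {

      override def optimize(data: RDD[(Double, Vector)], initialWeights: Vector): Vector = {
        val rng = new Random(seed)
        var best = initialWeights.toArray.clone()
        var bestLoss = lossFn(data, Vectors.dense(best))
        for (_ <- 1 to numIterations) {
          // Propose a Gaussian perturbation of the current best weights
          // and keep it only if it lowers the error.
          val candidate = best.map(w => w + stepScale * rng.nextGaussian())
          val candidateLoss = lossFn(data, Vectors.dense(candidate))
          if (candidateLoss < bestLoss) {
            best = candidate
            bestLoss = candidateLoss
          }
        }
        Vectors.dense(best)
      }
    }
    ```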
    
    Maybe we can start by making the "optimizer", "gradient" and "updater" members of the ArtificialNeuralNetwork class vars instead of vals. Then we can create a separate ANN object for each "optimizer"/"gradient"/"updater" combination, e.g. "ArtificialNeuralNetworkWithLBFGS". We would also need to remove convergenceTol from the ArtificialNeuralNetwork constructor, since it is LBFGS-specific. A rough sketch of what that split could look like is below.
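
    Names and signatures here are guesses based on this discussion, not the actual PR code; LeastSquaresGradient stands in for the ANN-specific gradient, and the package line is there because GradientDescent's constructor is package-private inside MLlib:

    ```scala
    package org.apache.spark.mllib.ann  // needed to call GradientDescent's constructor

    import org.apache.spark.mllib.linalg.Vector
    import org.apache.spark.mllib.optimization._
    import org.apache.spark.rdd.RDD

    // Assumed shape of the base class after the change: optimizer, gradient and
    // updater become protected vars, so each subclass wires its own combination.
    abstract class ArtificialNeuralNetwork(val topology: Array[Int]) extends Serializable {
      protected var gradient: Gradient = _
      protected var updater: Updater = _
      protected var optimizer: Optimizer = _

      // Training delegates to whatever optimizer the subclass installed.
      def train(data: RDD[(Double, Vector)], initialWeights: Vector): Vector =
        optimizer.optimize(data, initialWeights)
    }

    // One concrete class per combination. convergenceTol moves here, since it
    // only makes sense for LBFGS.
    class ArtificialNeuralNetworkWithLBFGS(
        topology: Array[Int],
        numIterations: Int,
        convergenceTol: Double) extends ArtificialNeuralNetwork(topology) {

      gradient = new LeastSquaresGradient()  // stand-in for the ANN gradient
      updater = new SquaredL2Updater()
      optimizer = new LBFGS(gradient, updater)
        .setConvergenceTol(convergenceTol)
        .setNumIterations(numIterations)
    }

    // ...and an SGD variant, with no convergenceTol in its signature.
    class ArtificialNeuralNetworkWithSGD(
        topology: Array[Int],
        numIterations: Int,
        stepSize: Double) extends ArtificialNeuralNetwork(topology) {

      gradient = new LeastSquaresGradient()
      updater = new SquaredL2Updater()
      optimizer = new GradientDescent(gradient, updater)
        .setNumIterations(numIterations)
        .setStepSize(stepSize)
    }
    ```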

