yuhao yang created SPARK-11579:
----------------------------------

             Summary: Methods SGDOptimizer and LBFGSOptimizer in 
FeedForwardTrainer should not create a new optimizer every time they are invoked
                 Key: SPARK-11579
                 URL: https://issues.apache.org/jira/browse/SPARK-11579
             Project: Spark
          Issue Type: Improvement
          Components: ML
    Affects Versions: 1.6.0
            Reporter: yuhao yang
            Priority: Minor


This is just a small proposal based on some customer feedback. I can send a PR 
if it looks reasonable.

Currently the methods SGDOptimizer and LBFGSOptimizer in FeedForwardTrainer 
create a new optimizer every time they are invoked. This is not intuitive, 
since users assume they are still configuring the existing optimizer when they 
write: 

    feedForwardTrainer
      .SGDOptimizer
      .setMiniBatchFraction(0.002)

yet the call actually creates a new optimizer that discards any properties set 
previously.

A straightforward solution is to avoid creating a new optimizer when the 
current optimizer is already of the same kind:

    if (!optimizer.isInstanceOf[LBFGS]) {
      optimizer = new ...
    }
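A minimal self-contained sketch of the proposed guard, using simplified stand-in classes (Trainer, LBFGS, GradientDescent are hypothetical stand-ins here; the real FeedForwardTrainer in org.apache.spark.ml.ann differs in its constructors and fields):

```scala
trait Optimizer
class LBFGS extends Optimizer { var numIterations = 100 }
class GradientDescent extends Optimizer { var miniBatchFraction = 1.0 }

class Trainer {
  private var optimizer: Optimizer = new GradientDescent

  // Reuse the existing optimizer if it is already an LBFGS instance,
  // so properties set on earlier invocations are preserved.
  def LBFGSOptimizer: LBFGS = {
    if (!optimizer.isInstanceOf[LBFGS]) {
      optimizer = new LBFGS
    }
    optimizer.asInstanceOf[LBFGS]
  }
}
```

With this guard, two consecutive calls to LBFGSOptimizer return the same instance, so a chained setter call no longer silently resets previously configured properties.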





--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
