Re: [scikit-learn] Incremental learning in scikit-learn

2019-09-09 Thread Daniel Sullivan
Hey Farzana, the algorithm only keeps one batch in memory at a time. While processing a batch, SGD maintains a set of weights that it updates with each iteration over a data point (instance) within the batch. This set of weights functions as the persisted state between calls of partial_fit. …
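A minimal sketch of this batch-at-a-time pattern, using SGDClassifier and a hypothetical load_batches generator standing in for any out-of-core data source:

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    rng = np.random.default_rng(0)

    def load_batches(n_batches=5, batch_size=100, n_features=10):
        # Hypothetical stand-in for an out-of-core source: yields one
        # (X, y) chunk at a time, so only one batch is ever in memory.
        for _ in range(n_batches):
            X = rng.normal(size=(batch_size, n_features))
            y = (X[:, 0] > 0).astype(int)
            yield X, y

    clf = SGDClassifier(loss="hinge")
    classes = np.array([0, 1])  # partial_fit needs the full label set up front

    for X_batch, y_batch in load_batches():
        # clf.coef_ persists between calls -- this is the set of weights
        # that carries state from one partial_fit call to the next.
        clf.partial_fit(X_batch, y_batch, classes=classes)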

Re: [scikit-learn] Questions about partial_fit and the Incremental library in Sci-kit learn

2019-09-09 Thread Daniel Sullivan
… based on the difference between the estimate and the target. How much the weights change depends on the loss function and learning rate you specify. …
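As a toy illustration of that update (squared loss here, not scikit-learn's internal code), the weight change is proportional to the estimate/target difference, scaled by the learning rate:

    import numpy as np

    def sgd_step(w, x, y, eta=0.01):
        # One SGD update under squared loss: the gradient is
        # (estimate - target) * x, i.e. d/dw of 0.5 * (w @ x - y) ** 2.
        estimate = w @ x
        return w - eta * (estimate - y) * x

    w = np.zeros(3)
    w = sgd_step(w, np.array([1.0, 2.0, 0.5]), y=1.0)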

Re: [scikit-learn] Questions about partial_fit and the Incremental library in Sci-kit learn

2019-09-09 Thread Daniel Sullivan
Hi Farzana, if I understand your question correctly, you're asking how the SGD classifier works incrementally. The SGD algorithm maintains a single set of weights and iterates through the data points of a batch one at a time, adjusting the weights on each iteration. So to answer your question, …
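A small check of that single-set-of-weights behavior (synthetic data for illustration): a second partial_fit call resumes from, and updates, the coefficients left by the first:

    import numpy as np
    from sklearn.linear_model import SGDClassifier

    X = np.random.default_rng(0).normal(size=(200, 5))
    y = (X[:, 0] > 0).astype(int)

    clf = SGDClassifier(loss="hinge")
    clf.partial_fit(X[:100], y[:100], classes=np.array([0, 1]))
    w_first = clf.coef_.copy()

    clf.partial_fit(X[100:], y[100:])  # continues from the same weights
    print(np.allclose(w_first, clf.coef_))  # False: weights updated in place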

Re: [scikit-learn] [Scikit-learn-general] Gradient Descent

2016-06-29 Thread Daniel Sullivan
(Sent to wrong mailing list, sorry for duplication) Hi Chaitanya, yes, the Stochastic Gradient Descent algorithm logic is written in Cython. The implementation can be viewed here: https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/linear_model/sgd_fast.pyx Hope that helps, Danny
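For orientation, the core loop that file implements looks roughly like the following pure-Python sketch (hinge loss, no regularization -- an assumption for illustration, not the actual Cython code):

    import numpy as np

    def plain_sgd(X, y, eta=0.01, n_epochs=5):
        # Schematic SGD loop: one weight update per sample per epoch.
        # Labels y are assumed to be in {-1, +1}.
        w = np.zeros(X.shape[1])
        for _ in range(n_epochs):
            for x_i, y_i in zip(X, y):
                if y_i * (w @ x_i) < 1:  # hinge loss subgradient is nonzero
                    w += eta * y_i * x_i
        return w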