It's great to see that the proposal has a list of models this API should
cover. But note that the D2L book has a very simplified train function (I
wrote it). It is oversimplified compared to what we use in real life.
Kaggle competition solutions and popular GitHub projects are closer to what
FYI, I have created a branch on the repo to facilitate multiple
collaborators working on this feature:
https://github.com/apache/incubator-mxnet/tree/fit-api. They'll create PRs
against this branch, and once the API is feature complete, I will rebase and
merge it into master to preserve the commit history.
On Sun, Feb 10, 2019 at 10:43 PM Hagay Lupesko wrote:
Wanted to chime in as well.
I have reviewed the design shared in the mail offline with Ankit, Lai, and
Naveen (we work on the same team at Amazon).
I think it does a good job of simplifying many low-complexity training use
cases, which can make MXNet/Gluon even more friendly to so-called "deep
Hi Alfredo,
Thanks for your comments; I really like all your suggestions. Here are my
answers, let me know if they make sense or if you have further comments.
1) The fit API is targeting novice users, covering about 80% of the use
cases listed in the document. For advanced users
and complex models, we (Naveen,
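To make the novice-facing surface concrete, a fit-style API would wrap the hand-written training loop behind a single call. The sketch below is a hypothetical, pure-Python illustration; the `Estimator` class name, its constructor argument, and the `fit` signature are assumptions for illustration, not the actual proposed Gluon API.

```python
# Hypothetical sketch of a fit-style wrapper; all names and signatures
# here are illustrative assumptions, not the proposed MXNet API.

class Estimator:
    def __init__(self, step_fn):
        # step_fn(params, batch) -> (new_params, loss) encapsulates
        # the forward/backward/update work for one batch.
        self.step_fn = step_fn

    def fit(self, params, data, epochs=10):
        """Run the standard epoch/batch loop so users don't rewrite it."""
        for _ in range(epochs):
            for batch in data:
                params, _loss = self.step_fn(params, batch)
        return params

# Toy usage: fit y = w * x with squared-error gradient steps.
def sgd_step(w, batch, lr=0.1):
    x, y = batch
    grad = 2 * (w * x - y) * x
    return w - lr * grad, (w * x - y) ** 2

est = Estimator(sgd_step)
w = est.fit(0.0, [(1.0, 2.0), (2.0, 4.0)], epochs=30)
```

The point of the design is that only `step_fn` (the model-specific part) changes between models; the epoch/batch scaffolding lives in one place.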
This is great and something we should all be able to benefit from.
There are just three pieces I’d like to advocate for that I feel are
shortcomings of some competing APIs on other frameworks (e.g., TF Estimators)
and I would love to see in this proposal:
1) Make serialization/deserialization of
Hello dev@,
Training a model in Gluon requires users to write their own training loop.
This is useful because of its imperative nature; however, repeating the same
boilerplate code across multiple models becomes tedious and repetitive. The
training loop can also be overwhelming to some
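For context, the boilerplate in question typically has the same shape across models. The snippet below is a simplified pure-Python stand-in (no actual MXNet/Gluon calls; the model, data, and hyperparameters are made up for illustration) that mirrors the structure of a hand-written Gluon loop: iterate epochs, iterate batches, forward pass, loss, backward pass, parameter update, bookkeeping.

```python
# Simplified stand-in for a hand-written training loop. It fits
# y = w * x by gradient descent on squared error; the loop structure
# (epochs -> batches -> forward/loss/grad/update) is the part that
# gets repeated verbatim across models.

def train(data, epochs=20, lr=0.1):
    w = 0.0  # model parameter (stands in for the network's params)
    for epoch in range(epochs):
        epoch_loss = 0.0
        for x, y in data:                  # iterate over the data loader
            pred = w * x                   # forward pass
            loss = (pred - y) ** 2         # loss function
            grad = 2 * (pred - y) * x      # backward pass (autograd in Gluon)
            w -= lr * grad                 # optimizer step
            epoch_loss += loss             # metric / loss bookkeeping
        # a real loop would also log metrics, run validation, checkpoint, ...
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x
print(round(train(data), 3))  # converges to 2.0
```

Everything except the three "model-specific" lines (forward, loss, gradient) is the boilerplate that a fit API would absorb.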