drivanov opened a new pull request #16398: Aggregated adamw update
URL: https://github.com/apache/incubator-mxnet/pull/16398

## Description ##
MXNet operator for the aggregated AdamW update.

## Checklist ##
### Essentials ###
- [x] Changes are complete (i.e. I finished coding on this PR)
- [x] All changes have test coverage:
  - Unit tests are added for small changes to verify correctness (e.g. adding a new operator)
- [x] Code is well-documented
- [x] To my best knowledge, examples are either not affected by this change, or have been fixed to be compatible with this change

### Changes ###
- [x] New operator that performs the AdamW update for multiple gradients in one kernel; tests (and, when applicable, API doc). A reference sketch of the aggregated update is given below.
- [x] The test for the previously used "single" AdamW update has been modified; it now also covers the `clip_gradient` parameter and random variations of `lr`, `eta`, and `wd`.

## Comments ##
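For readers unfamiliar with the contrib AdamW operators, here is a minimal NumPy sketch of what an aggregated update step computes. The function name `aggregated_adamw_update`, its signature, and the per-tensor handling of `lr`, `eta`, and `wd` are illustrative assumptions, not the actual operator added by this PR; the real implementation fuses the per-tensor loop into a single kernel.

```python
# Illustrative sketch only; the operator in this PR performs the same math
# for all tensors in one fused kernel instead of a Python loop.
import numpy as np

def aggregated_adamw_update(weights, grads, means, variances, lrs, etas, wds,
                            beta1=0.9, beta2=0.999, epsilon=1e-8,
                            rescale_grad=1.0, clip_gradient=-1.0):
    """Apply one AdamW step to every (weight, grad, m, v) tuple.

    Hypothetical reference implementation: the per-tensor `lr`, `eta`, `wd`
    lists mirror the randomized values exercised by the updated test.
    """
    for w, g, m, v, lr, eta, wd in zip(weights, grads, means, variances,
                                       lrs, etas, wds):
        g = g * rescale_grad
        if clip_gradient >= 0:                      # optional gradient clipping
            g = np.clip(g, -clip_gradient, clip_gradient)
        m[:] = beta1 * m + (1.0 - beta1) * g        # first moment estimate
        v[:] = beta2 * v + (1.0 - beta2) * g * g    # second moment estimate
        # Decoupled weight decay, as in the single-tensor AdamW update
        w[:] -= eta * (lr * m / (np.sqrt(v) + epsilon) + wd * w)
```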