[GitHub] [incubator-mxnet] samskalicky commented on issue #16708: Training an FPN model using grad_req="add" causes rapid divergence, while manually implemented gradient accumulation works fine

2019-11-11 Thread GitBox
samskalicky commented on issue #16708: Training an FPN model using grad_req="add" causes rapid divergence, while manually implemented gradient accumulation works fine
URL: https://github.com/apache/incubator-mxnet/issues/16708#issuecomment-552585909

@zachgk assign @szha

[GitHub] [incubator-mxnet] samskalicky commented on issue #16708: Training an FPN model using grad_req="add" causes rapid divergence, while manually implemented gradient accumulation works fine

2019-11-11 Thread GitBox
samskalicky commented on issue #16708: Training an FPN model using grad_req="add" causes rapid divergence, while manually implemented gradient accumulation works fine
URL: https://github.com/apache/incubator-mxnet/issues/16708#issuecomment-552611785

@zhreshold any update on this issue?
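
For context, the behavior the issue title compares is gradient accumulation via grad_req="add" in MXNet Gluon versus accumulating gradients manually. The following is a minimal sketch of the grad_req="add" approach, not the reporter's actual FPN training code; the toy Dense network, the random data, and names such as accum_steps are illustrative assumptions.

    # Sketch only: gradient accumulation with grad_req='add' in MXNet Gluon.
    # The network, data, and accum_steps below are illustrative, not from the issue.
    import mxnet as mx
    from mxnet import gluon, autograd

    net = gluon.nn.Dense(2)
    net.initialize()
    # 'add' makes backward() accumulate into each parameter's .grad
    # instead of overwriting it on every call.
    net.collect_params().setattr('grad_req', 'add')

    loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()
    trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})

    accum_steps = 4   # number of micro-batches to accumulate before one update
    batch_size = 8

    for i in range(8):
        data = mx.nd.random.uniform(shape=(batch_size, 16))
        label = mx.nd.random.randint(0, 2, shape=(batch_size,)).astype('float32')
        with autograd.record():
            loss = loss_fn(net(data), label)
        loss.backward()
        if (i + 1) % accum_steps == 0:
            # normalize the update by the effective batch size
            trainer.step(batch_size * accum_steps)
            # with grad_req='add' the gradients are never reset automatically,
            # so they must be zeroed by hand after each optimizer step
            for p in net.collect_params().values():
                p.zero_grad()

In the manually implemented variant mentioned in the title, gradients are instead copied into separate buffers after each backward() and summed before the update; the issue reports that this converges while the grad_req="add" path diverges rapidly.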