szha commented on issue #13512:
URL:
https://github.com/apache/incubator-mxnet/issues/13512#issuecomment-688435787
> @szha Also, the `clip_global_norm` function does a rescaling and not a
> clipping. We just want to ignore the gradients which are larger than some
> threshold.
I see. So you want to clip the absolute value of each dimension of the
gradients instead of rescaling based on the global norm. You can do the
following in that case:
```python
# Clip each gradient array element-wise, in place.
for p in net.collect_params().values():
    if p.grad_req != 'null':
        for g in p.list_grad():
            mx.nd.clip(g, MIN, MAX, out=g)
```
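To make the distinction concrete, here is a minimal NumPy sketch contrasting the two behaviors. NumPy stands in for MXNet arrays purely for illustration, and the helper names `rescale_by_global_norm` and `clip_elementwise` are hypothetical, not MXNet APIs:

```python
import numpy as np

# Two toy "gradient" arrays standing in for a network's parameter gradients.
grads = [np.array([0.5, -3.0, 1.5]), np.array([4.0, -0.2])]

def rescale_by_global_norm(grads, max_norm):
    # Global-norm rescaling (the clip_global_norm behavior): if the combined
    # L2 norm exceeds max_norm, scale every gradient by max_norm / total_norm.
    # All entries shrink proportionally; gradient directions are preserved.
    total_norm = np.sqrt(sum(float((g ** 2).sum()) for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads

def clip_elementwise(grads, min_val, max_val):
    # Element-wise clipping (the mx.nd.clip behavior): clamp each entry
    # independently into [min_val, max_val]. Large entries are truncated,
    # small entries are untouched, so the gradient direction can change.
    return [np.clip(g, min_val, max_val) for g in grads]
```

With `clip_elementwise(grads, -1.0, 1.0)` only the entries outside `[-1, 1]` are truncated, while `rescale_by_global_norm(grads, 1.0)` uniformly shrinks every entry until the total norm is exactly 1.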