dmas-at-wiris opened a new issue #10073: NaN in loss when using gluon ELU block
URL: https://github.com/apache/incubator-mxnet/issues/10073
 
 
   I have a CNN that uses ELUs as the activation function. When I use the gluon implementation (`F.where(x > 0, x, self._alpha * (F.exp(x) - 1.0))` from [here](https://github.com/apache/incubator-mxnet/blob/master/python/mxnet/gluon/nn/activations.py#L161)), I get NaN values in the loss after a few epochs.
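   
   My current guess (not yet verified against my full training script) is that the `where`-based formula still evaluates `F.exp(x)` for large positive activations, which overflows to `inf`, and the backward pass then produces `inf * 0 = NaN`. Here is a minimal sketch of what I mean; the toy input value is mine, not from my network:
   
   ```python
   from mxnet import nd, autograd
   
   alpha = 1.0
   x = nd.array([100.0])   # large positive activation; exp(100) overflows float32
   x.attach_grad()
   with autograd.record():
       # same formula as the gluon ELU block, just inlined here
       y = nd.where(x > 0, x, alpha * (nd.exp(x) - 1.0))
   y.backward()
   # on my reading of the gradient flow, this should print [nan]:
   # the non-selected branch gets grad 0, but 0 * inf from exp(x) is NaN
   print(x.grad)
   ```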
   
   However, if I instead create a block that calls `F.LeakyReLU(x, act_type='elu', slope=alpha)`, the loss doesn't diverge. Any idea what is going on?
   
   I'll try to provide a minimal working example ASAP, if necessary.
