Sundrops opened a new issue #19657:
URL: https://github.com/apache/incubator-mxnet/issues/19657


   > This operator accepts a customized loss function symbol as a terminal loss 
and the symbol should be an operator with no backward dependency. The output of 
this function is the gradient of loss with respect to the input data.
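
   For reference, here is a minimal Symbol-API sketch of the usage the quoted description seems to refer to (my own illustrative example, not taken from the docs; the variable names are placeholders):
   
   ```python
    import mxnet as mx

    # Build a custom loss expression from plain symbols (illustrative names)
    pred = mx.sym.Variable('pred')
    gt = mx.sym.Variable('gt')
    loss_l2 = mx.sym.sum(mx.sym.square(pred - gt), axis=1) / 2

    # Wrap the expression so it is treated as a terminal loss in the graph,
    # per the quoted description above
    loss = mx.sym.make_loss(loss_l2)
   ```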
   
   The description of ndarray.make_loss is the same as the description of symbol.make_loss, and it only explains the symbol case, not the ndarray case.
   I want to know what `F.make_loss()` will do when I use `net.hybridize()` and `loss.backward()`.
   
   
https://mxnet.apache.org/versions/1.7.0/api/python/docs/api/ndarray/ndarray.html?highlight=make_loss#mxnet.ndarray.make_loss
   
https://mxnet.apache.org/versions/1.7.0/api/python/docs/api/symbol/symbol.html#mxnet.symbol.make_loss
   
   ```python
    import mxnet as mx
    from mxnet import autograd

    class Myloss(mx.gluon.nn.HybridBlock):
        def __init__(self):
            super(Myloss, self).__init__()

        def hybrid_forward(self, F, pred, gt):
            # per-sample L2 loss
            loss_l2 = F.sum(F.square(pred - gt), axis=1) / 2
            return F.make_loss(loss_l2)

    net = resnet()  # placeholder backbone; `input` and `y` below are placeholders too
    net.hybridize()
    myloss = Myloss()
    with autograd.record():  # recording is required for loss.backward() to work
        x = net(input)
        loss = myloss(x, y)
    loss.backward()
   ```
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]


