oleg-trott commented on issue #17684: The output of the ReLU layer in MXNET is different from that in tensorflow and cntk
URL: https://github.com/apache/incubator-mxnet/issues/17684#issuecomment-592997242
 
 
   @braindotai 
   
   > As given [here](https://mxnet.apache.org/api/python/docs/api/gluon/model_zoo/index.html) make sure that you are normalizing your image as below
   
   I don't think normalization is the culprit if the previous layer's outputs match. Keras probably uses the same network weights with all backends.
   
   @Justobe 
   
   I don't have the other frameworks installed, so I can't reproduce this, but my suggestion is:
   
   Check the inputs to `relu`. Since it's a very simple function,
   
   ```
   x * (x > 0)
   ```
   
   it should be easy to check that the output is what it's supposed to be. If it isn't, use the captured input and output to construct a reproducible case that doesn't need the other frameworks.
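
   To make that concrete, here is a minimal sketch of such a check using the Keras functional API. The toy network, the layer name `relu_probe`, and the file name `relu_input.npy` are illustrative stand-ins, not taken from the issue:

   ```
   import numpy as np
   from keras import backend as K
   from keras.layers import Activation, Dense, Input
   from keras.models import Model

   # Stand-in model; in the real case, load the network under test
   # and pick out its ReLU layer instead.
   inp = Input(shape=(4,))
   pre = Dense(4)(inp)
   out = Activation('relu', name='relu_probe')(pre)
   model = Model(inp, out)

   relu_layer = model.get_layer('relu_probe')

   # Read the input and the output of the ReLU layer for the same batch.
   probe = K.function([model.input], [relu_layer.input, relu_layer.output])
   x_batch = np.random.randn(8, 4).astype('float32')
   relu_in, relu_out = probe([x_batch])

   # ReLU is x * (x > 0), so the two should agree elementwise.
   expected = relu_in * (relu_in > 0)
   print(np.abs(relu_out - expected).max())  # anything but ~0 is suspicious

   # If the check fails, the saved input alone reproduces the problem
   # within a single framework.
   np.save('relu_input.npy', relu_in)
   ```

   On the MXNet backend this exercises exactly the layer in question, and a saved failing input can then be replayed through `mx.nd.relu` directly, with no other framework involved.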
   
