zachgk commented on issue #16010: DropConnect Layer
URL: https://github.com/apache/incubator-mxnet/issues/16010#issuecomment-533298429
 
   The TensorFlow workaround you gave applies dropout to the weight tensor and 
then multiplies the result by the keep probability to undo dropout's rescaling 
of the surviving weights. This should work in MXNet and other frameworks as well.
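A minimal, framework-agnostic sketch of that workaround in NumPy (the function name and shapes are illustrative, not from the thread): standard inverted dropout zeroes entries with probability p and scales survivors by 1/(1 - p), so multiplying the output by (1 - p) leaves a DropConnect-style weight matrix where surviving weights keep their original values.

```python
import numpy as np

def dropconnect_weights(w, p, rng):
    """DropConnect via the dropout workaround (illustrative sketch).

    Apply inverted dropout to the weight matrix, then multiply by the
    keep probability (1 - p) to undo dropout's 1/(1 - p) rescaling,
    leaving each weight either zeroed or at its original value.
    """
    keep = 1.0 - p
    mask = rng.random(w.shape) < keep     # keep each weight w.p. (1 - p)
    dropped = w * mask / keep             # inverted dropout on the weights
    return dropped * keep                 # undo the rescaling

rng = np.random.default_rng(0)
w = np.ones((4, 4))
out = dropconnect_weights(w, 0.5, rng)    # entries are now 0.0 or 1.0
```

In MXNet the same idea applies: run the weight tensor through the framework's dropout op during training, then scale by the keep probability before the matrix multiply.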

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services