Hi! 
I'm working on nested (meta-)learning. I'm stuck on a problem: I need to 
reinitialize a network's weights with new values in such a way that I can 
still differentiate the outputs with respect to those new values (via 
*mxnet.autograd.grad*). What is a feasible way to do this under the 
constraints of *mxnet.autograd.record*? Recording doesn't allow any in-place 
operations (*dst[:] = src* or *set_data*), and calling the basic *initialize* 
method makes the variables "unreachable from the outputs" in the graph.





---
[Visit Topic](https://discuss.mxnet.io/t/reinitialize-networks-weights-to-be-able-to-differentiate/6541/1) or reply to this email to respond.
