pilhoon commented on issue #2932: How can I set different 'lr_mult' for weight and bias in one layer?
URL: https://github.com/apache/incubator-mxnet/issues/2932#issuecomment-487824087

It looks like a few things have changed since this issue was opened. You can simply create the parameter variables yourself with `mx.sym.var`, give each one its own `lr_mult`, and pass them to the layer (the multiplier values below are just illustrative):

```python
import mxnet as mx

data = mx.sym.var('data')
w = mx.sym.var('fc1_weight', lr_mult=0.1)  # weight updates at 0.1x the base learning rate
b = mx.sym.var('fc1_bias', lr_mult=2.0)    # bias updates at 2x the base learning rate
fc1 = mx.sym.FullyConnected(data=data, name='fc1', num_hidden=64, weight=w, bias=b)
```
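As a quick sanity check (a sketch I'm adding here, not part of the original comment): the multipliers end up as symbol attributes, so you can inspect them on the composed symbol. As far as I know, MXNet stores them internally under the `__lr_mult__` attribute key:

```python
# Continuing from the snippet above: lr_mult is kept as a per-variable
# symbol attribute, so attr_dict() should show it for each parameter.
print(fc1.attr_dict())
# Expected roughly: {'fc1_weight': {'__lr_mult__': '0.1'},
#                    'fc1_bias':   {'__lr_mult__': '2.0'}, ...}
```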