nudles commented on issue #696:
URL: https://github.com/apache/singa/issues/696#issuecomment-628364838


   > > 1. Before we call Module.forward(), we can randomly fill the placeholder tensors.
   > > 2. We can make Layer.init() optional. To implement a new layer, the parameter initialization can be done within the `__call__` method or in an `init()` method. It is up to the contributor.
   > >
   > > Any comments on the drawbacks?
   > > @dcslin @XJDKC
   > 
   >     1. If we separate initialization from forward propagation, there is no need to fill the placeholders. Tensor data is only accessed during forward propagation; initialization only needs the shapes, types, and so on.
   Then we need a method like `infer_output_shapes(self, input_shapes)`; otherwise, we have to call the forward method to get the output shapes from the output tensors. I prefer to call the forward function to avoid adding a new method to each layer.
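   For illustration, here is a minimal sketch of the two alternatives in plain numpy (not the actual SINGA API; the `Linear` class, the `infer_output_shapes` signature, and the lazy creation of `W` are assumptions for this example):

   ```python
   import numpy as np

   class Linear:
       def __init__(self, num_output):
           self.num_output = num_output
           self.W = None  # parameters are created lazily on first forward

       # Option A: a dedicated shape-inference method added to every layer;
       # only shapes are inspected, no tensor data is touched.
       def infer_output_shapes(self, input_shapes):
           return [(input_shapes[0][0], self.num_output)]

       # Option B: run forward on a placeholder tensor and read the shape
       # off the output, so no extra method is needed per layer.
       def forward(self, x):
           if self.W is None:
               self.W = np.random.randn(x.shape[1], self.num_output)
           return x @ self.W

   layer = Linear(num_output=4)
   print(layer.infer_output_shapes([(8, 16)]))   # Option A: [(8, 4)]
   placeholder = np.zeros((8, 16))               # Option B: dummy input
   print(layer.forward(placeholder).shape)       # (8, 4)
   ```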
   > 
   >     2. That's great. But if users move the initialization into the `__call__` method, they have to determine by themselves whether the layer has already been initialized.
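   As a rough sketch of the bookkeeping this implies (again plain numpy rather than the SINGA API, and the `_initialized` flag name is an assumption), a layer that initializes inside `__call__` has to guard against re-initialization itself:

   ```python
   import numpy as np

   class Linear:
       def __init__(self, num_output):
           self.num_output = num_output
           self._initialized = False  # assumed flag name, not a SINGA attribute

       def __call__(self, x):
           # lazy initialization: parameter shapes depend on the first input seen
           if not self._initialized:
               self.W = np.random.randn(x.shape[1], self.num_output)
               self._initialized = True
           return x @ self.W

   layer = Linear(4)
   y = layer(np.ones((2, 3)))  # the first call creates W with shape (3, 4)
   ```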
   
   

