nudles commented on issue #691:
URL: https://github.com/apache/singa/issues/691#issuecomment-629606349


   Please check my inline comments starting with **
   
   > 
   > 
   > ```
   > # handle ONNX
   > def to_onnx(model):
   >     ...  # returns an ONNX model built from a SINGA model
   > 
   > class SONNXModel(Module):
   >     def __init__(self, onnx_model):  ** need to pass the dev as an argument.
   >         singa_rep = sonnx.prepare(onnx_model, device=dev, batchsize=1)
   >         for layer_name, layer in singa_rep.layers.items():
   >             self.__dict__[layer_name] = layer
   >         # store the weights here as numpy arrays
   >         for weight_name, weight in singa_rep.weights.items():
   >             self.weights[weight_name] = weight
   >         # store layer info such as input and output names (weights only)
   >         for layer_name, layer_info in singa_rep.layer_infos.items():
   >             self.layer_infos[layer_name] = layer_info
   > 
   >     def forward(self, aux_output):
   >         # run forward according to the onnx graph
   >         ...  # returns the last output plus aux_output
   > 
   >     def compile(self):  ** shall we just move the code of compile() into __init__? For subclasses of SONNXModel, we do not expect users to call compile() explicitly.
   >         # init weights
   >         super().compile()  ** args like dev, use_graph, graph_alg should be passed.
   >         # set the weights' values
   >         for layer_name, layer in self.__dict__.items():
   >             input_info, output_info = self.layer_infos[layer_name]
   >             for input_name in input_info:
   >                 layer.set_weight(self.weights[input_name])  ** remember to release self.weights to free memory.
   > 
   > class MyModel(SONNXModel):
   >     def __init__(self, onnx):
   >         super().__init__(onnx)
   >         self.layer1 = Conv()
   >         self.layer2 = Conv()
   > 
   >     def forward(self, x):
   >         x1, x2 = super().forward(x, aux_output)
   >         x = self.layer1.forward(x2)
   >         return self.layer2.forward(x1) + x
   > 
   >     def train_one_batch(self, x, y):
   >         y_ = self.forward(x)
   >         ...
   > ```
   > 
   > How about this one: we parse the ONNX model with `sonnx.prepare` (the Backend), which returns a `singa_rep` (a BackendRep). The `singa_rep` contains the layers, weights, and input/output info, and we store the layers in `self.__dict__`. When we compile the model, we first call super() to initialize the params, then set their values from the weights loaded from ONNX.
   
   It's good to reuse singa_rep.
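   
   To make the inline suggestions concrete, here is a rough sketch of `SONNXModel` with them applied: `dev` is passed into `__init__`, `compile()` forwards the graph-related args, and `self.weights` is released once the values are copied in. Note the `singa_rep` attributes (`layers`, `weights`, `layer_infos`) and the `compile()` signature are assumptions taken from the proposal above, not final API.
   
   ```python
   # Sketch only: singa_rep.layers/.weights/.layer_infos and the
   # Module.compile(inputs, is_train, use_graph, graph_alg) signature are
   # assumed from the proposal above, not the released sonnx API.
   from singa import module, sonnx
   
   class SONNXModel(module.Module):
       def __init__(self, onnx_model, dev, batchsize=1):
           super().__init__()
           # dev is now an explicit argument instead of a global
           singa_rep = sonnx.prepare(onnx_model, device=dev, batchsize=batchsize)
           for layer_name, layer in singa_rep.layers.items():
               self.__dict__[layer_name] = layer
           # keep the ONNX weights (numpy) only until compile() copies them in
           self.weights = dict(singa_rep.weights)
           self.layer_infos = dict(singa_rep.layer_infos)
   
       def compile(self, inputs, is_train, use_graph, graph_alg):
           # init the params first, forwarding the graph-related args
           super().compile(inputs, is_train, use_graph, graph_alg)
           # overwrite the freshly initialized values with the ONNX weights
           for layer_name, (input_info, output_info) in self.layer_infos.items():
               layer = self.__dict__[layer_name]
               for input_name in input_info:
                   layer.set_weight(self.weights.pop(input_name))
           # every weight has been transferred; drop the numpy copies
           self.weights.clear()
   ```
   
   Folding this logic into `__init__` instead, as suggested, would just mean calling `self.compile(...)` at the end of `__init__` once the input placeholders are known.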
   
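   A hypothetical usage of the subclassing pattern, where `MyModel` is the subclass from the snippet above; the file name and the training loop are illustrative only:
   
   ```python
   import onnx
   from singa import device
   
   # load a serialized ONNX graph (the path is illustrative)
   onnx_model = onnx.load("model.onnx")
   dev = device.create_cuda_gpu()
   
   # MyModel extends SONNXModel with two extra Conv layers (see the
   # proposal above); dev is passed through per the inline comment
   model = MyModel(onnx_model, dev)
   
   # train_one_batch comes from the proposal; x, y are numpy batches
   for x, y in train_loader:  # train_loader is a stand-in for real data
       loss = model.train_one_batch(x, y)
   ```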
   

