dcslin edited a comment on issue #696:
URL: https://github.com/apache/singa/issues/696#issuecomment-628355516
Considering the API requirements and constraints below:
API requirements:
1. [**Model**] multi-input/output (multiple loss functions)
2. [**Model**] load a model from disk; in other words, model param memory
allocation should be done in `model.__init__`
API constraints:
1. [**Model**] the graph module buffers the first forward call, **or** the graph
module is turned off during the first forward call
2. [**Layer**] layer param memory allocation & initialization require the input
x
@XJDKC 's scheme 2 and @nudles 's `Placeholder` are close to the current
implementation, and the changes required could be small.
For model building:
```python
class MyModel(Model):
    def __init__(self, inputs, configs):
        self.mylayer = MyLayer(configs)
        self.linear1 = Linear(configs, kernel_init=configs.ker_init)
        # runs the warm-up forward; may be a bit confusing for users
        super().__init__(inputs)

    def forward(self, inputs):
        return self.linear1(self.mylayer(inputs[0], inputs[1]))
```
For model running:
```python
x = Placeholder(shape=(batch, shape1, shape2))
m = MyModel([x], **configs, **graph_configs)
m.on_device(gpu)
m.train()
for e in range(epochs):
    for x, y in data_gen:
        losses = m.loss(y, m(x))
        m.optim(losses[0])
        m.optim(losses[1])
```
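To make the multi-loss part of the loop above concrete, here is a minimal framework-free sketch in plain NumPy (names such as `heads`, `mse`, and the learning rate are illustrative, not SINGA API): two output heads share one parameter vector, and each loss contributes its own gradient step, mirroring `m.optim(losses[0]); m.optim(losses[1])`.

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(3,))          # shared parameter across both heads
lr = 0.01

def heads(x):
    # two outputs computed from the same parameter w
    return x @ w, (x * 2.0) @ w

def mse(pred, target):
    return ((pred - target) ** 2).mean()

x = rng.normal(size=(32, 3))
y1, y2 = x @ np.ones(3), (x * 2.0) @ np.ones(3)   # targets from w* = 1

loss_before = mse(heads(x)[0], y1) + mse(heads(x)[1], y2)
for _ in range(200):
    o1, o2 = heads(x)
    # one gradient step per loss, i.e. two optimizer calls per batch
    w -= lr * (2.0 / len(x)) * x.T @ (o1 - y1)
    w -= lr * (2.0 / len(x)) * (2.0 * x).T @ (o2 - y2)
loss_after = mse(heads(x)[0], y1) + mse(heads(x)[1], y2)
assert loss_after < loss_before
```

Stepping each loss separately (rather than summing them first) matches the proposed API surface, at the cost of two parameter updates per batch.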
For Layer building:
```python
class MyLayer(Layer):
    def __init__(self, configs):
        self.configs = configs
        self.init = False

    def __call__(self, inputs):
        # defer param allocation & initialization to the first call,
        # since the param shapes depend on the input
        if not self.init:
            self.W = Tensor(self.configs, inputs[0].shape).initializer()
            self.device_check(inputs, self.W)
            self.init = True
        return op1(inputs[0], inputs[1])
```
For Module class impl:
```python
class Module:
    def __init__(self, placeholder_input, configs):
        # warm-up forward with the graph turned off, so that every
        # layer's params are allocated inside __init__
        turn_off_graph()
        self.forward(*placeholder_input)
        turn_on_graph()

    def __call__(self, inputs):
        return self.forward(*inputs)
```
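Putting the four pieces together, a runnable plain-Python sketch of the scheme (NumPy stands in for SINGA tensors; `Placeholder`, `Linear`, and `Model` here are illustrative mock-ups, not the proposed API itself) shows the key property: the warm-up forward in `__init__` allocates every layer's params before training starts, which is what makes loading params from disk right after construction possible.

```python
import numpy as np

class Placeholder:
    """Carries a shape plus dummy data, so the warm-up forward can run."""
    def __init__(self, shape):
        self.shape = shape
        self.data = np.zeros(shape)

class Linear:
    """Param allocation is deferred to the first call (needs input shape)."""
    def __init__(self, out_features):
        self.out_features = out_features
        self.init = False

    def __call__(self, x):
        if not self.init:
            # W is allocated lazily, once the input shape is known
            self.W = np.random.randn(x.data.shape[-1], self.out_features) * 0.01
            self.init = True
        out = Placeholder((x.data.shape[0], self.out_features))
        out.data = x.data @ self.W
        return out

class Model:
    def __init__(self, placeholder_inputs):
        # warm-up forward: every layer's params get allocated here,
        # i.e. inside model construction (graph buffering would be off)
        self.forward(*placeholder_inputs)

class MyModel(Model):
    def __init__(self, inputs):
        self.linear1 = Linear(4)
        super().__init__(inputs)

    def forward(self, x):
        return self.linear1(x)

x = Placeholder(shape=(2, 8))
m = MyModel([x])
assert m.linear1.init                  # params already exist after __init__
assert m.linear1.W.shape == (8, 4)
y = m.forward(Placeholder((2, 8)))
assert y.data.shape == (2, 4)
```

In a real implementation the placeholder would carry no data at all and the graph module would record or skip the warm-up pass, but the control flow is the same.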
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]