[https://issues.apache.org/jira/browse/SINGA-476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel]

zhangzhaoqi updated SINGA-476:
------------------------------
    Description: 
For demo purposes, we need to implement these three models; these are their components:
h2. [Tiny yolov2|https://arxiv.org/pdf/1612.08242.pdf]

Add
BatchNormalization
Conv
LeakyRelu
MaxPool
Mul
h2. [Arcface|https://arxiv.org/pdf/1801.07698.pdf]

Acos
Add
BatchNormalization
Conv
Cos
Dropout
Flatten
Gemm
Identity
InstanceNormalization
LpNormalization
Mul
PRelu
Reshape
Sub
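
Most of these map onto simple kernels; for instance, LpNormalization (used by Arcface to normalize the embedding) is just division by the Lp norm along an axis. A dependency-free sketch over a 1-D list (hypothetical helper names, not SINGA code):

```python
def lp_normalization(x, p=2):
    """Sketch of LpNormalization over a 1-D list of floats:
    divide each element by the Lp norm of the whole vector.
    Hypothetical helper for illustration only, not SINGA's API."""
    norm = sum(abs(v) ** p for v in x) ** (1.0 / p)
    return [v / norm for v in x]
```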
h2. [BIDAF|https://arxiv.org/pdf/1611.01603.pdf]

Abs
Add
ArgMax
Cast
Ceil
Clip
Compress
Concat
ConstantOfShape
Conv
Dropout
Gather
Hardmax
Log
LSTM
MatMul
ReduceMax
ReduceSum
Relu
Shape
Sigmoid
Slice
Squeeze
Sub
Sum
Transpose
Unsqueeze

 

In summary, we have already implemented 13 ops, and there are still 27 ops that 
need to be implemented:
h2. Already implemented:

-Acos-
-BatchNormalization-
-Cos-
-Conv-
-LeakyRelu-
-LSTM-
-Abs-
-MaxPool-
-Flatten-
-Add-
-MatMul-
-Relu-
-Sigmoid-
h2. To be implemented:

ArgMax
Cast
Ceil
Clip
Compress
Concat
ConstantOfShape
Dropout
Gather
Gemm
Hardmax
Identity
InstanceNormalization
Log
LpNormalization
Mul
PRelu
ReduceMax
ReduceSum
Reshape
Shape
Slice
Squeeze
Sub
Sum
Transpose
Unsqueeze

Please refer to the [ONNX Operator 
Schemas|https://github.com/onnx/onnx/blob/master/docs/Operators.md] for more 
detailed information.
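
As an illustration of the pattern each of these ops needs, here is a dependency-free sketch of an elementwise Mul as a forward/backward pair in autograd style. The class name and method signatures are hypothetical and do not reflect SINGA's actual Operation API:

```python
class Mul:
    """Illustrative autograd-style elementwise Mul (hypothetical API).

    forward caches both inputs so backward can apply the product rule:
    d(a*b)/da = b and d(a*b)/db = a.
    """

    def forward(self, a, b):
        # Cache the inputs needed by the backward pass.
        self.a, self.b = a, b
        return a * b

    def backward(self, dy):
        # Return the gradient w.r.t. each input, scaled by the
        # upstream gradient dy.
        return dy * self.b, dy * self.a
```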

  was:
For demo purposes, we need to implement these three models; these are their components:
h2. [Tiny yolov2|https://arxiv.org/pdf/1612.08242.pdf]

MaxPooling2D
 Conv2D
 BatchNormalization
 LeakyReLU
 Reshape
h2. [Arcface|https://arxiv.org/pdf/1801.07698.pdf]

Conv2D
 BatchNormalization
 relu
 MaxPooling2D
 Dropout
 Flatten
 Dense
 Softmax
 l2_normalize
 acos
 cos
h2. [BIDAF|https://arxiv.org/pdf/1611.01603.pdf]

K.stack
 Softmax
 K.expand_dims
 K.sum
 Constant
 Dense
 Lambda(lambda x: 1.0 - x, output_shape=(dim,))
 Multiply
 Add
 K.concatenate
 K.shape
 K.max
 K.tile
 K.squeeze
 linear
 TimeDistributed
 Bidirectional(LSTM

 

 

In summary, we have already implemented 12 ops, and there are still 16 ops that 
need to be implemented:
h2. Already implemented:

-LSTM-
 -Multiply-
 -Add-
 -linear-
 -relu-
 -acos-
 -cos-
 -LeakyReLU-
 -Softmax-
 -MaxPooling2D-
 -Conv2D-
 -BatchNormalization-
h2. To be implemented:

Reshape
 Flatten
 Dropout
 max
 shape
 concatenate
 Constant
 L2Normalization
 Expand
 tile
 squeeze
 Dense*
 TimeDistributed*
 Bidirectional*
 Stack*
 Lambda*

* means this op doesn't have a corresponding one in the ONNX op set; therefore, it 
needs a converter function built from the basic ops.
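
For instance, a Dense layer could be lowered to MatMul followed by Add. A minimal numeric sketch of that decomposition, using plain nested lists and hypothetical names (a real converter would emit ONNX MatMul and Add nodes instead of computing values):

```python
def convert_dense(x, weight, bias):
    """Sketch: express a Dense layer via the basic ops MatMul + Add.

    x is (rows x inner), weight is (inner x cols), bias has length cols.
    Hypothetical helper for illustration; not an actual converter.
    """
    rows, inner, cols = len(x), len(weight), len(weight[0])
    # MatMul: y = x @ weight
    y = [[sum(x[i][k] * weight[k][j] for k in range(inner))
          for j in range(cols)]
         for i in range(rows)]
    # Add: broadcast bias across the rows
    return [[y[i][j] + bias[j] for j in range(cols)] for i in range(rows)]
```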

 


> Autograd operators for ONNX
> ---------------------------
>
>                 Key: SINGA-476
>                 URL: https://issues.apache.org/jira/browse/SINGA-476
>             Project: Singa
>          Issue Type: New Feature
>            Reporter: zhangzhaoqi
>            Priority: Critical
>         Attachments: arcface(based resnet100).png, bidaf.png, tiny_yolov2.png
>



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
