[GitHub] [tvm] ANSHUMAN87 commented on a change in pull request #7267: [Frontend][Tensorflow] Sparse dense matmul adjoint option added

2021-01-26 Thread GitBox


ANSHUMAN87 commented on a change in pull request #7267:
URL: https://github.com/apache/tvm/pull/7267#discussion_r565007656



##
File path: python/tvm/relay/frontend/tensorflow.py
##
@@ -941,9 +934,48 @@ def _impl(inputs, attr, params, mod):
 (values_tensor, (rows, cols)), shape=tuple(dense_shape_tensor.tolist())
 )
 
+# As per the tensorflow implementation, we have 4 possible input combinations,
+# and the first input (A) is always sparse while the second input (B) is always dense.
+# Case 1: A, B, adjoint_a=False, adjoint_b=False  --> A * B
+# Case 2: A, B, adjoint_a=True,  adjoint_b=False  --> A.T * B
+# Case 3: A, B, adjoint_a=False, adjoint_b=True   --> A * B.T
+# Case 4: A, B, adjoint_a=True,  adjoint_b=True   --> (A.T * B.T).T
+#
+# The Topi implementation of sparse_dense (matmul) has 2 possible input
+# combinations, where the first input (A) is always dense
+# and the second input (B) is always sparse.
+# Case 1: A, B, sparse_lhs=False  --> A * B.T
+# Case 2: A, B, sparse_lhs=True   --> B * A.T
+#
+# The mapping would be as below:
+# TF Case 1: A, B, adjoint_a=False, adjoint_b=False
Review comment:
   Done!
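A small illustrative check of the behaviour the frontend has to reproduce (not part of the PR; assumes TensorFlow 2.x and NumPy are available): tf.sparse.sparse_dense_matmul takes a sparse first operand and a dense second operand and applies the adjoint flags before multiplying.

    import numpy as np
    import tensorflow as tf

    a = np.array([[1.0, 0.0], [0.0, 2.0]], dtype=np.float32)
    b = np.array([[3.0, 4.0], [5.0, 6.0]], dtype=np.float32)
    sp_a = tf.sparse.from_dense(a)  # sparse first input, as in the comment above

    # Case 2 from the list above: adjoint_a=True, adjoint_b=False --> A.T * B
    out = tf.sparse.sparse_dense_matmul(sp_a, b, adjoint_a=True, adjoint_b=False)
    assert np.allclose(out.numpy(), a.T @ b)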





This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org




[GitHub] [tvm] ANSHUMAN87 commented on a change in pull request #7267: [Frontend][Tensorflow] Sparse dense matmul adjoint option added

2021-01-26 Thread GitBox


ANSHUMAN87 commented on a change in pull request #7267:
URL: https://github.com/apache/tvm/pull/7267#discussion_r564993711



##
File path: python/tvm/relay/frontend/tensorflow.py
##
@@ -941,8 +934,47 @@ def _impl(inputs, attr, params, mod):
 (values_tensor, (rows, cols)), shape=tuple(dense_shape_tensor.tolist())
 )
 
-if sparse_lhs:
+# As per the tensorflow implementation, we have 4 possible input combinations,
+# and the first input (A) is always sparse while the second input (B) is always dense.
+# Case 1: A, B, adjoint_a=False, adjoint_b=False  --> A * B
+# Case 2: A, B, adjoint_a=True,  adjoint_b=False  --> A.T * B
+# Case 3: A, B, adjoint_a=False, adjoint_b=True   --> A * B.T
+# Case 4: A, B, adjoint_a=True,  adjoint_b=True   --> (A.T * B.T).T
+#
+# The Topi implementation of sparse_dense (matmul) has 2 possible input
+# combinations, where the first input (A) is always dense
+# and the second input (B) is always sparse.
+# Case 1: A, B, sparse_lhs=False  --> A * B.T
+# Case 2: A, B, sparse_lhs=True   --> B * A.T
+#
+# The mapping would be as below:
+# TF Case 1: A, B, adjoint_a=False, adjoint_b=False
+#   --> sparse_dense(transpose(B), A, sparse_lhs=True)
+#
+# TF Case 2: A, B, adjoint_a=True, adjoint_b=False
+#   --> sparse_dense(transpose(B), transpose(A), sparse_lhs=True)
+#
+# TF Case 3: A, B, adjoint_a=False, adjoint_b=True
+#   --> sparse_dense(B, A, sparse_lhs=True)
+#
+# TF Case 4: A, B, adjoint_a=True, adjoint_b=True
+#   --> transpose(sparse_dense(B, transpose(A), sparse_lhs=False))
+
+# By default, in tensorflow the first input, i.e., data, is sparse
+sparse_lhs = True
+
+# TF Case 1:
+if not attr.get("adjoint_a") and not attr.get("adjoint_b"):
+    data = _op.transpose(data)
+# TF Case 2:
+elif attr.get("adjoint_a") and not attr.get("adjoint_b"):
     data = _op.transpose(data)
+    weight_sp = csr_matrix(weight_sp.transpose())
+# TF Case 3:
+elif not attr.get("adjoint_a") and attr.get("adjoint_b"):
+    pass
+# TF Case 4:
+# attr.get("adjoint_a") and attr.get("adjoint_b"):
 else:
     weight_sp = csr_matrix(weight_sp.transpose())

Review comment:
   Good catch!
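To see that the mapping in the hunk above is arithmetically sound, here is an illustrative NumPy check (not from the PR). mock_sparse_dense is a hypothetical dense stand-in for the topi contract stated in the comment: sparse_lhs=False gives dense * sparse.T, and sparse_lhs=True gives sparse * dense.T.

    import numpy as np

    def mock_sparse_dense(dense, sparse, sparse_lhs=False):
        # Dense stand-in for topi sparse_dense; the second operand plays the sparse role.
        return sparse @ dense.T if sparse_lhs else dense @ sparse.T

    rng = np.random.default_rng(0)
    A = rng.random((4, 4))  # stands in for the sparse TF operand
    B = rng.random((4, 4))  # dense TF operand; square so every adjoint combination is valid

    # TF Case 1: A * B    --> sparse_dense(B.T, A,   sparse_lhs=True)
    assert np.allclose(A @ B, mock_sparse_dense(B.T, A, sparse_lhs=True))
    # TF Case 2: A.T * B  --> sparse_dense(B.T, A.T, sparse_lhs=True)
    assert np.allclose(A.T @ B, mock_sparse_dense(B.T, A.T, sparse_lhs=True))
    # TF Case 3: A * B.T  --> sparse_dense(B,   A,   sparse_lhs=True)
    assert np.allclose(A @ B.T, mock_sparse_dense(B, A, sparse_lhs=True))
    # TF Case 4: both adjoints set --> transpose(sparse_dense(B, A.T, sparse_lhs=False))
    assert np.allclose(A.T @ B.T, mock_sparse_dense(B, A.T, sparse_lhs=False).T)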









[GitHub] [tvm] ANSHUMAN87 commented on a change in pull request #7267: [Frontend][Tensorflow] Sparse dense matmul adjoint option added

2021-01-24 Thread GitBox


ANSHUMAN87 commented on a change in pull request #7267:
URL: https://github.com/apache/tvm/pull/7267#discussion_r563293629



##
File path: python/tvm/relay/frontend/tensorflow.py
##
@@ -941,9 +934,48 @@ def _impl(inputs, attr, params, mod):
 (values_tensor, (rows, cols)), shape=tuple(dense_shape_tensor.tolist())
 )
 
+# As per the tensorflow implementation, we have 4 possible input combinations,
+# and the first input (A) is always sparse while the second input (B) is always dense.
+# Case 1: A, B, adjoint_a=False, adjoint_b=False  --> A * B
+# Case 2: A, B, adjoint_a=True,  adjoint_b=False  --> A.T * B
+# Case 3: A, B, adjoint_a=False, adjoint_b=True   --> A * B.T
+# Case 4: A, B, adjoint_a=True,  adjoint_b=True   --> (A.T * B.T).T
+#
+# The Topi implementation of sparse_dense (matmul) has 2 possible input
+# combinations, where the first input (A) is always dense
+# and the second input (B) is always sparse.
+# Case 1: A, B, sparse_lhs=False  --> A * B.T
+# Case 2: A, B, sparse_lhs=True   --> B * A.T
+#
+# The mapping would be as below:
+# TF Case 1: A, B, adjoint_a=False, adjoint_b=False
Review comment:
   This part is already explained in the comment section above; it is implicit. I think it is okay.
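Since the topi contract keeps coming up in this thread, a tiny illustrative NumPy mock of the two sparse_lhs cases may help (not from the PR; topi_sparse_dense_ref is a hypothetical name, and the "sparse" operand is kept dense for simplicity).

    import numpy as np

    def topi_sparse_dense_ref(dense, sparse, sparse_lhs=False):
        # Reference semantics from the quoted comment: first input dense, second sparse.
        # sparse_lhs=False --> dense * sparse.T ; sparse_lhs=True --> sparse * dense.T
        return sparse @ dense.T if sparse_lhs else dense @ sparse.T

    rng = np.random.default_rng(0)
    dense_in = rng.random((3, 4))
    sparse_in = rng.random((5, 4))  # treated as a plain array; real topi takes CSR/BSR
    assert topi_sparse_dense_ref(dense_in, sparse_in).shape == (3, 5)                   # A * B.T
    assert topi_sparse_dense_ref(dense_in, sparse_in, sparse_lhs=True).shape == (5, 3)  # B * A.T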









[GitHub] [tvm] ANSHUMAN87 commented on a change in pull request #7267: [Frontend][Tensorflow] Sparse dense matmul adjoint option added

2021-01-24 Thread GitBox


ANSHUMAN87 commented on a change in pull request #7267:
URL: https://github.com/apache/tvm/pull/7267#discussion_r563293446



##
File path: python/tvm/relay/frontend/tensorflow.py
##
@@ -942,7 +942,10 @@ def _impl(inputs, attr, params, mod):
 )
 
 if sparse_lhs:
-    data = _op.transpose(data)
+    if attr.get("adjoint_a"):

Review comment:
   Done!









[GitHub] [tvm] ANSHUMAN87 commented on a change in pull request #7267: [Frontend][Tensorflow] Sparse dense matmul adjoint option added

2021-01-22 Thread GitBox


ANSHUMAN87 commented on a change in pull request #7267:
URL: https://github.com/apache/tvm/pull/7267#discussion_r562791091



##
File path: python/tvm/relay/frontend/tensorflow.py
##
@@ -942,7 +942,10 @@ def _impl(inputs, attr, params, mod):
 )
 
 if sparse_lhs:
-    data = _op.transpose(data)
+    if attr.get("adjoint_a"):

Review comment:
   @tkonolige: I have added a few comments; would you please take a look? Thanks!









[GitHub] [tvm] ANSHUMAN87 commented on a change in pull request #7267: [Frontend][Tensorflow] Sparse dense matmul adjoint option added

2021-01-22 Thread GitBox


ANSHUMAN87 commented on a change in pull request #7267:
URL: https://github.com/apache/tvm/pull/7267#discussion_r562744658



##
File path: python/tvm/relay/frontend/tensorflow.py
##
@@ -942,7 +942,10 @@ def _impl(inputs, attr, params, mod):
 )
 
 if sparse_lhs:
-    data = _op.transpose(data)
+    if attr.get("adjoint_a"):

Review comment:
   Yes, I agree the logic is not straightforward; however, it is written this way to match the operation to the Topi op.
   I think a matching series of ifs may not be good, as it would branch out the complete code if we want to tag each case.
   But I can try putting some comments over the current code so that the logic is easier to understand. Please let me know if you think otherwise.



