slyubomirsky commented on a change in pull request #7006:
URL: https://github.com/apache/tvm/pull/7006#discussion_r533841208



##########
File path: python/tvm/relay/frontend/keras.py
##########
@@ -927,6 +1031,66 @@ def _convert_repeat_vector(inexpr, keras_layer, _):
     return out
 
 
+def _convert_time_distributed(inexpr, keras_layer, etab, input_shape=None, data_layout=None):
+    # TimeDistributed: split input tensor along the second dimension (assumed to be time),
+    # apply inner layer to each split individually,
+    # and then combine the results
+    if input_shape is None:
+        input_shape = keras_layer.input_shape
+    if data_layout is None:
+        data_layout = etab.data_layout
+
+    assert len(input_shape) >= 2, "Input to TimeDistributed must have at least two dimensions"
+
+    inner_layer = keras_layer.layer
+    inner_input_shape = [d for (i, d) in enumerate(input_shape) if i != 1]
+
+    # for NDHWC, inner data layout will drop the D
+    inner_data_layout = None
+    if data_layout == "NDHWC":
+        inner_data_layout = "NHWC"
+
+    # some code duplication from keras_op_to_relay
+    # but it's useful to avoid cluttering the etab
+    inner_layer_op_name = type(keras_layer.layer).__name__
+    if inner_layer_op_name not in _convert_map:
+        raise tvm.error.OpNotImplemented(
+            "The inner layer for TimeDistributed {} is not supported for 
frontend Keras.".format(
+                inner_layer_op_name
+            )
+        )
+
+    conversion_func = lambda expr: _convert_map[inner_layer_op_name](
+        expr, inner_layer, etab, input_shape=inner_input_shape, data_layout=inner_data_layout

Review comment:
       Using https://www.tensorflow.org/api_docs/python/tf/keras/models/clone_model, I might be able to copy the inner layer of `TimeDistributed` and set its `input_shape`, but I'm not sure whether that's possible. I will check.
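Roughly, the idea would be something like the sketch below. This is only illustrative: `clone_model` is documented for `tf.keras.Model` instances rather than bare layers (which is exactly the open question), so this version copies the layer through `get_config`/`from_config` instead, and the `Dense(8)` layer plus the `(16,)` per-timestep shape are made-up placeholders.

```python
import tensorflow as tf

# Hypothetical wrapper for illustration; Dense(8) and the (16,) per-timestep
# shape are placeholders, not values from the PR.
td = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(8))
inner = td.layer  # the layer wrapped by TimeDistributed

# tf.keras.models.clone_model is documented for Model instances, so whether it
# accepts a bare layer is the open question above; as a fallback, a layer can
# be copied through its config.
inner_copy = inner.__class__.from_config(inner.get_config())

# Calling the copy on a symbolic Input with the per-timestep shape (the
# original input shape minus the time dimension) gives it its own concrete
# input_shape without mutating the original wrapper.
_ = inner_copy(tf.keras.Input(shape=(16,)))
print(inner_copy.input_shape)  # (None, 16)
```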




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

