Thrsu opened a new issue, #15020:
URL: https://github.com/apache/tvm/issues/15020
I attempted to convert a Keras `Conv2DTranspose` model to a TFLite model and encountered a `TypeError` when importing it with `relay.frontend.from_tflite`. The error message stated that a bytes-like object was required, but an integer was provided instead.
### Actual behavior
```
Traceback (most recent call last):
  File "single_test.py", line 40, in <module>
    mod, params = relay.frontend.from_tflite(tflite_model, shape_dict)
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 4205, in from_tflite
    op_converter.convert_op_to_relay()
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 276, in convert_op_to_relay
    ret = self.convert_map[op_code_str](op)
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 3105, in convert_space_to_batch_nd
    paddings = self.get_tensor_value(input_tensors[2]).tolist()
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/tflite.py", line 466, in get_tensor_value
    return np.frombuffer(data, dtype=dtype).reshape(shape)
TypeError: a bytes-like object is required, not 'int'
```
### Environment
- TVM: 0.13.dev0
- TensorFlow: 2.12.0
- TFLite: 2.10.0
- Keras: 2.12.0
### Steps to reproduce
```python
from tensorflow import keras
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models
import tflite
import tvm
import tvm.relay as relay

np.random.seed(2023)
input_shape = (1, 5, 6, 3)
input_data = 10 * np.random.random(list(input_shape))
input_data -= 0.5
input_data = input_data.astype("float32")

kwargs = {
    "filters": 2,
    "kernel_size": 3,
    "padding": "same",
    "data_format": "channels_last",
    "dilation_rate": (2, 2),
}
layer_cls = keras.layers.Conv2DTranspose
layer = layer_cls(**kwargs)
weights = layer.get_weights()
layer.set_weights(weights)

x = layers.Input(shape=input_shape[1:], dtype="float32")
y = layer(x)
model = models.Model(x, y)

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
input_name = input_details[0]["name"].split(":")[0]

target = "llvm"
ctx = tvm.cpu(0)
shape_dict = {input_name: input_shape}
tflite_model = tflite.Model.GetRootAsModel(tflite_model, 0)
mod, params = relay.frontend.from_tflite(tflite_model, shape_dict)

with tvm.transform.PassContext(opt_level=3):
    model = relay.build_module.create_executor(
        "vm", mod, ctx, target, params
    ).evaluate()
tvm_output = model(tvm.nd.array(input_data)).numpy()
```
### Triage
- needs-triage
- frontend:tflite
### Analysis
In the function `get_tensor_value`, if `self.has_expr(tensor_wrapper.tensor_idx)` is true, the tensor's value is produced by another operator at runtime rather than stored as a constant in the model. There is then no constant buffer to read, so the flatbuffers accessor yields the integer `0` instead of a bytes object, and the subsequent `np.frombuffer(data, dtype=dtype)` raises `TypeError: a bytes-like object is required, not 'int'`.
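To illustrate the failure mode outside of TVM: `np.frombuffer` requires an object supporting the buffer protocol, so passing the integer `0` (what an empty flatbuffers buffer accessor returns, per the analysis above) reproduces exactly this `TypeError`. The guard at the end is a hypothetical sketch of a defensive check, not the actual TVM fix:

```python
import numpy as np

# Simulate what the frontend sees for a non-constant tensor: the buffer
# accessor returns the integer 0 instead of raw bytes (assumption based on
# the analysis above).
data = 0

try:
    np.frombuffer(data, dtype="float32")
except TypeError as exc:
    print(exc)  # a bytes-like object is required, not 'int'

# Hypothetical defensive check before decoding: only decode real buffers.
value = (
    np.frombuffer(data, dtype="float32")
    if isinstance(data, (bytes, bytearray, np.ndarray))
    else None
)
print(value)  # None: no constant data available for this tensor
```

A real fix in the frontend would instead need to handle the expression case (e.g. treat the `paddings` input as a Relay expression rather than a constant) before ever calling `get_tensor_value`.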