Theoo1997 opened a new issue, #12595:
URL: https://github.com/apache/tvm/issues/12595

   Hi,
   I am trying to execute int8 TFLite models on x86 (my local PC), so I used microTVM, specifically the [example](https://tvm.apache.org/docs/how_to/work_with_microtvm/micro_tflite.html#microtvm-with-tflite) you provide for this purpose. I ran the example without any problem, but when I tried to run int8 TFLite models from TF Hub an error occurred. To be more specific, I tried MobileNet and EfficientNet, and both models produced the error below.
   ```
   Traceback (most recent call last):
     File "micro_tflite.py", line 330, in <module>
       graph_mod = tvm.micro.create_local_graph_executor(
     File "session.py", line 221, in create_local_graph_executor
       fcreate(graph_json_str, mod, lookup_remote_linked_param, *device_type_id)
     File "packed_func.py", line 237, in __call__
       raise get_last_ffi_error()
   tvm.error.RPCError: Traceback (most recent call last):
     13: TVMFuncCall
     12: 
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::__mk_TVM0::{lambda(tvm::runtime::TVMArgs,
 tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, 
tvm::runtime::__mk_TVM0, tvm::runtime::TVMRetValue)
     11: tvm::runtime::GraphExecutorCreate(std::string const&, 
tvm::runtime::Module const&, std::vector<DLDevice, std::allocator<DLDevice> > 
const&, tvm::runtime::PackedFunc)
     10: tvm::runtime::GraphExecutor::Init(std::string const&, 
tvm::runtime::Module, std::vector<DLDevice, std::allocator<DLDevice> > const&, 
tvm::runtime::PackedFunc)
     9: tvm::runtime::GraphExecutor::SetupStorage()
     8: tvm::runtime::NDArray::Empty(tvm::runtime::ShapeTuple, DLDataType, 
DLDevice, tvm::runtime::Optional<tvm::runtime::String>)
     7: tvm::runtime::RPCDeviceAPI::AllocDataSpace(DLDevice, int, long const*, 
DLDataType, tvm::runtime::Optional<tvm::runtime::String>)
     6: tvm::runtime::RPCClientSession::AllocDataSpace(DLDevice, int, long 
const*, DLDataType, tvm::runtime::Optional<tvm::runtime::String>)
     5: 
tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::RPCEndpoint::Init()::{lambda(tvm::runtime::TVMArgs,
 tvm::runtime::TVMRetValue*)#2}> >::Call(tvm::runtime::PackedFuncObj const*, 
tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
     4: tvm::runtime::RPCEndpoint::HandleUntilReturnEvent(bool, 
std::function<void (tvm::runtime::TVMArgs)>)
     3: tvm::runtime::RPCEndpoint::EventHandler::HandleNextEvent(bool, bool, 
std::function<void (tvm::runtime::TVMArgs)>)
     2: 
tvm::runtime::RPCEndpoint::EventHandler::HandleProcessPacket(std::function<void 
(tvm::runtime::TVMArgs)>)
     1: 
tvm::runtime::RPCEndpoint::EventHandler::HandleReturn(tvm::runtime::RPCCode, 
std::function<void (tvm::runtime::TVMArgs)>)
     0: _ZN3tvm7runtime6deta
     File "/workspace/tvm/src/runtime/rpc/rpc_endpoint.cc", line 376
   RPCError: Error caught from RPC call:
   
   ```
   For some reason, something is going wrong in the `tvm.micro.create_local_graph_executor()` function.

