reminisce commented on a change in pull request #10433: [MXNET-290] MKLDNN support for model quantization
URL: https://github.com/apache/incubator-mxnet/pull/10433#discussion_r188752161
 
 

 ##########
 File path: include/mxnet/c_api.h
 ##########
 @@ -1423,13 +1423,15 @@ MXNET_DLL int MXSymbolInferType(SymbolHandle sym,
  * \param excluded_symbols array of symbols to be excluded from being quantized
  * \param num_offline number of parameters that are quantized offline
  * \param offline_params array of c strings representing the names of params quantized offline
+ * \param dev_type device type 
  */
 MXNET_DLL int MXQuantizeSymbol(SymbolHandle sym_handle,
                                SymbolHandle *ret_sym_handle,
                                const mx_uint num_excluded_symbols,
                                const SymbolHandle *excluded_symbols,
                                const mx_uint num_offline,
-                               const char **offline_params);
+                               const char **offline_params,
+                               int dev_type);
 
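For reference, a minimal usage sketch written against the signature in this hunk. The helper name, the offline parameter name, and the dev_type value below are illustrative assumptions, not taken from the PR.

#include <stdio.h>
#include <mxnet/c_api.h>

/* Hypothetical call sketch; quantize_for_cpu() and the names below are
 * illustrative, not part of the PR. `sym` is assumed to be a valid
 * SymbolHandle obtained elsewhere (e.g. via MXSymbolCreateFromFile). */
static int quantize_for_cpu(SymbolHandle sym, SymbolHandle *out_sym) {
  const char *offline_params[] = {"conv0_weight"};  /* assumption: example param name */
  int ret = MXQuantizeSymbol(sym, out_sym,
                             0, NULL,               /* no symbols excluded */
                             1, offline_params,     /* one param quantized offline */
                             1 /* dev_type; assumption: CPU device type id */);
  if (ret != 0)
    fprintf(stderr, "MXQuantizeSymbol failed: %s\n", MXGetLastError());
  return ret;
}
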
 Review comment:
   @jinhuang415 I think this design is clearer than before, and I agree with
adding config params to the interface. One minor suggestion: instead of
defining a boolean value `use_uint8`, it is more general to define a param
such as `quantized_dtype` that defaults to `int8`. This would allow us to pass
other low-precision data types to the backend.
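
As a concrete illustration of the suggestion, the declaration could carry the target type as a string rather than a boolean flag. A sketch under that assumption (not the PR's final interface):

MXNET_DLL int MXQuantizeSymbol(SymbolHandle sym_handle,
                               SymbolHandle *ret_sym_handle,
                               const mx_uint num_excluded_symbols,
                               const SymbolHandle *excluded_symbols,
                               const mx_uint num_offline,
                               const char **offline_params,
                               int dev_type,
                               const char *quantized_dtype);  /* e.g. "int8" (default) or "uint8" */

Passing the dtype as a string keeps the C API stable if further low-precision types are supported later, whereas a boolean flag would require another signature change for each new type.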

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


