ptrendx commented on a change in pull request #14153: Fix shape inference pass
URL: https://github.com/apache/incubator-mxnet/pull/14153#discussion_r257101056
 
 

 ##########
 File path: src/executor/infer_graph_attr_pass.cc
 ##########
 @@ -282,26 +285,28 @@ nnvm::Graph InferAttr(nnvm::Graph &&ret,
   };
 
   size_t last_num_unknown;
-  size_t num_unknown_dispatch_mode = dispatch_mode_name ? node_end - node_start : 0;
-  size_t num_unknown_entry_attr = entry_end - entry_start;
-  size_t num_unknown = num_unknown_entry_attr + num_unknown_dispatch_mode;
+  size_t num_unknown = static_cast<size_t>(-1);  // Infinity
+
+  if (scalars_only) {
+    size_t num_unknown_dispatch_mode = dispatch_mode_name ? node_end - node_start : 0;
+    size_t num_unknown_entry_attr = entry_end - entry_start;
+    num_unknown = num_unknown_entry_attr + num_unknown_dispatch_mode;
+  }
   int i = 0;
   do {
-    if (i % 2 == 0) {
 
 Review comment:
  As I've written in the description: `Change to the way passes are done so that both forward and backward inference is performed every time - I'm not sure if this is necessary - @eric-haibin-lin, thoughts?` - I'm not sure this check is necessary. I removed it because it looked like a bug: suppose you only have shape information on the output instead of the input. I was worried that the inference pass would then stop after a single forward sweep without inferring anything. Thinking about it more, though, even in that case the last op on the forward side will actually decrease the number of unknowns, so the `i = 1` (backward) step will still be invoked.
  If that is true, then this `if` statement is correct there and I will put it back.

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services
