sxjscience opened a new issue #19463:
URL: https://github.com/apache/incubator-mxnet/issues/19463


   ## Description
   Reproducible example:
   
   ```python
   import mxnet as mx
   from mxnet.gluon import nn
   from mxnet import amp
   mx.npx.set_np()
   amp.init()
   
   
   class Foo(nn.HybridBlock):
       def __init__(self, **kwargs):
           super().__init__(**kwargs)
           self.dense0 = nn.Dense(16, in_units=8)
   
       def forward(self, x):
           return mx.np.concatenate([self.dense0(x), x], axis=-1)
   
   foo = Foo()
   foo.initialize(ctx=mx.gpu())
   
   data = mx.np.random.normal(0, 1, (32, 8), ctx=mx.gpu())
   out = foo(data)
   print(out.dtype)
   ```
   
   Output:
   
   ```
   ---------------------------------------------------------------------------
   MXNetError                                Traceback (most recent call last)
   <ipython-input-9-07e1c93ce642> in <module>
        18 
        19 data = mx.np.random.normal(0, 1, (32, 8), ctx=mx.gpu())
   ---> 20 out = foo(data)
        21 print(out.dtype)
   
   ~/.local/lib/python3.6/site-packages/mxnet/gluon/block.py in __call__(self, x, *args)
      1419             if not self._active:
      1420                 # Normal imperative computation of forward()
   -> 1421                 return super().__call__(x, *args)
      1422 
      1423             if dc.is_deferred_compute():
   
   ~/.local/lib/python3.6/site-packages/mxnet/gluon/block.py in __call__(self, *args)
       709             hook(self, args)
       710 
   --> 711         out = self.forward(*args)
       712 
       713         for hook in self._forward_hooks.values():
   
   <ipython-input-9-07e1c93ce642> in forward(self, x)
        12 
        13     def forward(self, x):
   ---> 14         return mx.np.concatenate([self.dense0(x), x], axis=-1)
        15 
        16 foo = Foo()
   
   ~/.local/lib/python3.6/site-packages/mxnet/numpy/multiarray.py in concatenate(seq, axis, out)
      6520     array([1., 2., 3., 4., 5., 6.])
      6521     """
   -> 6522     return _mx_nd_np.concatenate(seq, axis=axis, out=out)
      6523 
      6524 
   
   ~/.local/lib/python3.6/site-packages/mxnet/ndarray/numpy/_op.py in concatenate(seq, axis, out)
      4522            [3., 4., 6.]])
      4523     """
   -> 4524     return _api_internal.concatenate(*seq, axis, out)
      4525 
      4526 
   
   ~/.local/lib/python3.6/site-packages/mxnet/_ffi/_ctypes/function.py in __call__(self, *args)
       113                 self.handle, values, tcodes, ctypes.c_int(num_args),
       114                 ctypes.byref(ret_val), ctypes.byref(ret_tcode)) != 0:
   --> 115             raise get_last_ffi_error()
       116         _ = temp_args
       117         _ = args
   
   MXNetError: MXNetError: Type inconsistent, Provided = float32, inferred type = float16
   ```
   
   @mk-61 Would you take a look at this? I'm not very familiar with the AMP code, so it may take me more time to resolve this issue myself.
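
   For anyone hitting this in the meantime, a possible workaround (an untested assumption about the cause, not a confirmed fix): AMP appears to cast the `Dense` output to float16 while the raw input `x` stays float32, so `concatenate`'s type inference sees two different dtypes. Explicitly casting the skip connection to match sidesteps the inconsistency:

   ```python
   import mxnet as mx
   from mxnet.gluon import nn


   class Foo(nn.HybridBlock):
       def __init__(self, **kwargs):
           super().__init__(**kwargs)
           self.dense0 = nn.Dense(16, in_units=8)

       def forward(self, x):
           h = self.dense0(x)
           # Cast the skip connection to the (possibly AMP-narrowed) dtype of
           # the Dense output; astype is a no-op when the dtypes already match.
           return mx.np.concatenate([h, x.astype(h.dtype)], axis=-1)
   ```

   This only papers over the symptom; `concatenate` should presumably be handled by AMP's widest-dtype casting the way `concat` is in the legacy ndarray API.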

