tustvold commented on issue #2837:
URL: https://github.com/apache/arrow-rs/issues/2837#issuecomment-1271416479

   > Would it be better to simply not have the dyn_primitive_scalar kernels and 
instead use docstrings or something else to show how to use the kernels with 
primitives?
   
   The issue isn't documentation, but that the kernels are inconsistent in 
their behaviour with respect to both the non-scalar kernels and the arithmetic 
comparison scalar kernels. If I try to add an `Int32Array` to a `Float32Array` I 
will get an error; however, if I currently try to add an `f32` to an 
`Int32Array`, it will coerce the float to an integer and add it. This is at best 
rather surprising, and at worst a subtle source of strange bugs.
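   To make the surprise concrete, here is a minimal standalone sketch of what a 
`ToPrimitive`-style coercion does to the scalar before the addition. This is 
not arrow-rs's actual code; `add_scalar_coercing` is a hypothetical stand-in 
that mimics the behaviour described above:

   ```rust
   // Hypothetical sketch: the float scalar is silently truncated to the
   // array's native type before the addition, mimicking the coercion
   // described above (not arrow-rs's real kernel).
   fn add_scalar_coercing(values: &[i32], scalar: f32) -> Vec<i32> {
       let coerced = scalar as i32; // 1.7_f32 silently becomes 1
       values.iter().map(|v| v + coerced).collect()
   }

   fn main() {
       let result = add_scalar_coercing(&[10, 20], 1.7);
       // The fractional part of the scalar is dropped without error:
       assert_eq!(result, vec![11, 21]);
   }
   ```

   Note the contrast with the array-to-array case, where mixing `Int32Array` 
and `Float32Array` is rejected outright rather than silently truncated.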
   
   Further, the encoding using `ToPrimitive` cannot be generalised to types 
other than those supported by `ToPrimitive`, i.e. the Rust built-in numeric 
types. This effectively blocks implementing these kernels for `i256`, i.e. 
`Decimal256`.
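   The constraint can be sketched without pulling in num-traits or arrow-rs: 
the trait and the `I256` type below are simplified stand-ins (the real 
`ToPrimitive` has more methods), but they show the shape of the trait-bound 
problem:

   ```rust
   // Minimal stand-in for num-traits' ToPrimitive (assumption: simplified;
   // the real trait has many more conversion methods).
   trait ToPrimitive {
       fn to_i64(&self) -> Option<i64>;
   }

   impl ToPrimitive for i32 {
       fn to_i64(&self) -> Option<i64> {
           Some(*self as i64)
       }
   }

   // A scalar kernel whose scalar argument is bounded by ToPrimitive can
   // only ever accept types implementing it.
   fn scalar_kernel<S: ToPrimitive>(scalar: S) -> Option<i64> {
       scalar.to_i64()
   }

   // A 256-bit integer stand-in (like Decimal256's i256): no ToPrimitive
   // impl exists or can sensibly exist, so passing it to scalar_kernel
   // simply does not compile.
   #[allow(dead_code)]
   struct I256 {
       hi: i128,
       lo: u128,
   }

   fn main() {
       assert_eq!(scalar_kernel(5_i32), Some(5));
       // scalar_kernel(I256 { hi: 0, lo: 1 }); // <- would not compile
   }
   ```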
   
   > which also gives some background on why we didn't go with the T::Native 
approach
   
   The key comment appears to be 
https://github.com/apache/arrow-rs/pull/1074#issuecomment-999105444. Is there 
any possibility I might be able to shift your feeling on this matter? Perhaps 
we could have a synchronous chat? Looking at DataFusion, `ScalarValue` is 
already concretely typed to match the `ArrowPrimitiveType` and not the 
`ArrowNativeType`, i.e. `ScalarValue::TimestampMillisecond` instead of 
`ScalarValue::I32(_)`, and so this change shouldn't materially impact its 
complexity: it already knows what the concrete type should be.
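   A simplified sketch of that point (this is not DataFusion's real 
`ScalarValue` enum, just a hypothetical two-variant version): because the 
variant is the Arrow type rather than the native Rust type, matching on it 
already pins down the concrete native value, so a `T::Native`-based scalar 
kernel adds no extra dispatch burden.

   ```rust
   // Hypothetical, stripped-down ScalarValue: the variant names the Arrow
   // type, and each variant's payload is already the matching native type.
   enum ScalarValue {
       Int32(Option<i32>),
       TimestampMillisecond(Option<i64>),
   }

   fn native_i64(v: &ScalarValue) -> Option<i64> {
       match v {
           // Each arm already knows the concrete native type it holds.
           ScalarValue::Int32(x) => x.map(|v| v as i64),
           ScalarValue::TimestampMillisecond(x) => *x,
       }
   }

   fn main() {
       let ts = ScalarValue::TimestampMillisecond(Some(42));
       assert_eq!(native_i64(&ts), Some(42));
   }
   ```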
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
