kumarUjjawal commented on code in PR #19871:
URL: https://github.com/apache/datafusion/pull/19871#discussion_r2702491427
##########
datafusion/functions/src/math/signum.rs:
##########
@@ -98,6 +98,34 @@ impl ScalarUDFImpl for SignumFunc {
     }
     fn invoke_with_args(&self, args: ScalarFunctionArgs) -> Result<ColumnarValue> {
+        let arg = &args.args[0];
+
+        // Scalar fast path for float types - avoid array conversion overhead
+        if let ColumnarValue::Scalar(scalar) = arg {
+            if scalar.is_null() {
+                return ColumnarValue::Scalar(ScalarValue::Null)
+                    .cast_to(args.return_type(), None);
+            }
+
+            match scalar {
+                ScalarValue::Float64(Some(v)) => {
+                    let result = if *v == 0.0 { 0.0 } else { v.signum() };
+                    return Ok(ColumnarValue::Scalar(ScalarValue::Float64(Some(result))));
+                }
+                ScalarValue::Float32(Some(v)) => {
+                    let result = if *v == 0.0 { 0.0 } else { v.signum() };
+                    return Ok(ColumnarValue::Scalar(ScalarValue::Float32(Some(result))));
+                }
+                _ => {
+                    return internal_err!(
+                        "Unexpected scalar type for signum: {:?}",
+                        scalar.data_type()
+                    );
+                }
+            }
+        }
+
+        // Array path
         make_scalar_function(signum, vec![])(&args.args)
Review Comment:
If my interpretation is correct, you are asking one of two things. If it is "should the scalar optimization be added inside make_scalar_function?", then doing that would require changing its signature to also accept a scalar function, which would be a larger refactor. If instead you meant "doesn't make_scalar_function already handle scalar optimization?", then no: it still converts scalars to arrays first. We have used this inline fast path in other parts of this optimization as well.
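
To make the cost concrete, here is a rough sketch of the round trip a scalar input takes through the array path (the `scalar_via_array` helper and its standalone framing are illustrative, not the actual `make_scalar_function` source; `to_array_of_size` and `try_from_array` are the real `ScalarValue` conversion methods):

```rust
use arrow::array::ArrayRef;
use datafusion_common::{Result, ScalarValue};
use datafusion_expr::ColumnarValue;

// Illustrative sketch: a Scalar argument is materialized as a one-row
// array, the vectorized kernel runs on that array, and the single result
// row is converted back into a ScalarValue.
fn scalar_via_array(
    kernel: impl Fn(&[ArrayRef]) -> Result<ArrayRef>,
    scalar: &ScalarValue,
) -> Result<ColumnarValue> {
    let array = scalar.to_array_of_size(1)?; // allocate a 1-row array
    let result = kernel(&[array])?;          // dispatch the array kernel
    // pull row 0 of the result back out as a scalar
    let out = ScalarValue::try_from_array(&result, 0)?;
    Ok(ColumnarValue::Scalar(out))
}
```

The inline fast path skips both conversions and the one-row kernel dispatch entirely.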