kazuyukitanimura commented on code in PR #422: URL: https://github.com/apache/datafusion-comet/pull/422#discussion_r1605660821
########## docs/source/contributor-guide/adding_a_new_expression.md: ##########
@@ -0,0 +1,212 @@
+<!---
+  Licensed to the Apache Software Foundation (ASF) under one
+  or more contributor license agreements. See the NOTICE file
+  distributed with this work for additional information
+  regarding copyright ownership. The ASF licenses this file
+  to you under the Apache License, Version 2.0 (the
+  "License"); you may not use this file except in compliance
+  with the License. You may obtain a copy of the License at
+
+    http://www.apache.org/licenses/LICENSE-2.0
+
+  Unless required by applicable law or agreed to in writing,
+  software distributed under the License is distributed on an
+  "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+  KIND, either express or implied. See the License for the
+  specific language governing permissions and limitations
+  under the License.
+-->
+
+# Adding an Expression
+
+There are a number of Spark expressions that are not yet supported by DataFusion Comet, and implementing them is a good way to contribute to the project.
+
+Before you start, have a look through [these slides](https://docs.google.com/presentation/d/1H0fF2MOkkBK8fPBlnqK6LejUeLcVD917JhVWfp3mb8A/edit#slide=id.p), as they provide a conceptual overview. A video of a presentation on those slides is available [here](https://drive.google.com/file/d/1POU4lFAZfYwZR8zV1X2eoLiAmc1GDtSP/view?usp=sharing).
+
+## Finding an Expression to Add
+
+You may have a specific expression in mind that you'd like to add, but if not, you can review the [expression coverage document](https://github.com/apache/datafusion-comet/blob/f08fcadd5fbdb5b04293d33e654f6c16f81b70c4/doc/spark_builtin_expr_coverage.txt) to see which expressions are not yet supported.
+
+## Adding the Expression
+
+Once you have the expression you'd like to add, you should take inventory of the following:
+
+1. What is the Spark expression's behavior across different Spark versions? These make good test cases and will inform you of any compatibility issues, such as an API change, that will have to be addressed.
+2. Check if the expression is already implemented in DataFusion and, if so, whether it is compatible with the Spark expression.
+   1. If it is, you can potentially reuse the existing implementation, though you'll need to add tests to verify compatibility.
+   2. If it's not, consider an initial version in DataFusion Comet and potentially backport it into DataFusion if the expression would be supported there.
+3. Test cases for the expression. As mentioned, you can refer to Spark's test cases for a good idea of what to test.
+
+Once you know what you want to add, you'll need to update the query planner to recognize the new expression in Scala and potentially add a new expression implementation in `core/` in Rust.
+
+### Adding the Expression in Scala
+
+The `QueryPlanSerde` object has a method `exprToProto`, which is responsible for converting a Spark expression to a protobuf expression. Within that method is an `exprToProtoInternal` method that contains a large match statement with a case for each expression type. You'll need to add a new case to this match statement for your new expression.
+
+For example, the `unhex` function looks like this:
+
+```scala
+case e: Unhex if !isSpark32 =>
+  val unHex = unhexSerde(e)
+
+  val childExpr = exprToProtoInternal(unHex._1, inputs)
+  val failOnErrorExpr = exprToProtoInternal(unHex._2, inputs)
+
+  val optExpr =
+    scalarExprToProtoWithReturnType("unhex", e.dataType, childExpr, failOnErrorExpr)
+  optExprWithInfo(optExpr, expr, unHex._1)
+```
+
+A few things to note here:
+
+* The `isSpark32` check is used to fall back to Spark's implementation of `unhex` in Spark 3.2, as only later versions have the `failOnError` parameter.

Review Comment:
   For this example, we have the shim for `failOnError`, so it is not the reason for using `isSpark32`?
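To make the shim mentioned in the review comment above concrete, here is a minimal sketch of what a per-Spark-version `unhexSerde` helper could look like. The trait name and file layout are assumptions for illustration (check the actual Comet shim sources); the grounded parts are only that `unhexSerde` returns the child expression together with a `failOnError` expression, and that Spark 3.2's `Unhex` has no `failOnError` parameter.

```scala
// Sketch of a version shim compiled against Spark 3.3+ sources (names are illustrative).
import org.apache.spark.sql.catalyst.expressions.{Expression, Literal, Unhex}

trait CometExprShim {
  // Expose the child and the failOnError flag as a pair of expressions so that
  // QueryPlanSerde can serialize both without any version-specific logic.
  def unhexSerde(unhex: Unhex): (Expression, Expression) =
    (unhex.child, Literal(unhex.failOnError))
}

// A Spark 3.2 source tree would provide the same signature but hard-code the flag,
// since Unhex has no failOnError parameter there:
//
//   def unhexSerde(unhex: Unhex): (Expression, Expression) =
//     (unhex.child, Literal(false))
```

With a shim like this selected at build time, the `case e: Unhex` branch above can stay version-agnostic, which appears to be the point of the question about the `isSpark32` guard.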
########## docs/source/contributor-guide/adding_a_new_expression.md: ##########
+2. Check if the expression is already implemented in DataFusion and, if so, whether it is compatible with the Spark expression.
+   1. If it is, you can potentially reuse the existing implementation, though you'll need to add tests to verify compatibility.
+   2. If it's not, consider an initial version in DataFusion Comet and potentially backport it into DataFusion if the expression would be supported there.

Review Comment:
   nit: I would say something like `If it's not, consider an initial version in DataFusion for expressions that are common across different engines. For expressions that are specific to Spark, consider an initial version in DataFusion Comet.`

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscr...@datafusion.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org