[ https://issues.apache.org/jira/browse/SPARK-37971?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Carlos Gameiro updated SPARK-37971:
-----------------------------------
    Description: 
This functionality would serve very specific use cases.

Consider a DataFrame with a column of SQL expressions encoded as strings. Individually it is possible to evaluate each string and obtain the corresponding result. However, it is not possible to apply the expr function row-wise (via a UDF or map) and evaluate all expressions efficiently.

id | sql_expression
---+---------------------
 1 | abs(-1) + 12
 2 | decode(1,2,3,4) - 1
 3 | 30 * 20 - 5

df = df.withColumn('sql_eval', f.expr_row('sql_expression'))

  was:
This functionality would serve very specific use cases. Consider a DataFrame with a column of SQL expressions encoded as strings. Individually it is possible to evaluate each string and obtain the corresponding result. However, it is not possible to apply the expr function row-wise (via a UDF or map) and evaluate all expressions efficiently.

> Apply and evaluate expressions row-wise in a DataFrame
> ------------------------------------------------------
>
>                 Key: SPARK-37971
>                 URL: https://issues.apache.org/jira/browse/SPARK-37971
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Carlos Gameiro
>            Priority: Critical
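In the absence of a native expr_row (the name used above is the proposed, not-yet-existing API), here is a minimal workaround sketch using only the existing PySpark surface: collect the distinct expression strings to the driver, then dispatch on them row-wise with a chained CASE WHEN, evaluating each string natively through F.expr. This assumes the set of distinct expressions is small enough to enumerate on the driver (with a high-cardinality expression column the CASE WHEN blows up, which is exactly the gap expr_row would close), and the decode example needs Spark 3.2+, matching the Affects Version.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # The example DataFrame from the description: one SQL expression string per row.
    df = spark.createDataFrame(
        [(1, "abs(-1) + 12"), (2, "decode(1,2,3,4) - 1"), (3, "30 * 20 - 5")],
        ["id", "sql_expression"],
    )

    # Collect the distinct expression strings, then build one CASE WHEN branch
    # per expression, each compiled to a native column via F.expr.
    exprs = [row[0] for row in df.select("sql_expression").distinct().collect()]

    case_col = None
    for e in exprs:
        cond = F.col("sql_expression") == e
        case_col = F.when(cond, F.expr(e)) if case_col is None else case_col.when(cond, F.expr(e))

    df.withColumn("sql_eval", case_col).show()
    # Expected sql_eval values: 13, 3, and 595 for ids 1, 2, and 3
    # (decode(1,2,3,4) falls through to the default 4, so 4 - 1 = 3).

Each branch of the CASE WHEN only fires for rows whose expression string matches, so every row is evaluated by the Catalyst-compiled form of its own expression; the cost is one F.expr per distinct string rather than per row, which is why this scales with expression cardinality, not row count.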