[ 
https://issues.apache.org/jira/browse/SPARK-39832?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Maciej Szymkiewicz resolved SPARK-39832.
----------------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Issue resolved by pull request 37329
[https://github.com/apache/spark/pull/37329]

> regexp_replace should support column arguments
> ----------------------------------------------
>
>                 Key: SPARK-39832
>                 URL: https://issues.apache.org/jira/browse/SPARK-39832
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Brian Schaefer
>            Assignee: Brian Schaefer
>            Priority: Major
>              Labels: starter
>             Fix For: 3.4.0
>
>
> {{F.regexp_replace}} in PySpark currently only supports strings for the 
> second and third arguments: 
> [https://github.com/apache/spark/blob/1df6006ea977ae3b8c53fe33630e277e8c1bc49c/python/pyspark/sql/functions.py#L3265]
> In Scala, columns are also supported: 
> [https://github.com/apache/spark/blob/1df6006ea977ae3b8c53fe33630e277e8c1bc49c/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L2836]
> The desire to use columns as arguments for the function has been raised 
> previously on StackExchange: 
> [https://stackoverflow.com/questions/64613761/in-pyspark-using-regexp-replace-how-to-replace-a-group-with-value-from-another],
>  where the suggested fix was to use {{F.expr}}.
> It should be relatively straightforward for PySpark to support both of the 
> function signatures that Scala already supports.
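
The row-wise behavior that column arguments enable can be sketched in plain
Python with {{re.sub}} (the data and function name below are hypothetical, and
Spark actually uses Java regexes, whose flavor differs slightly from Python's
{{re}}): the pattern and replacement come from other columns of the same row,
which string-only arguments cannot express without {{F.expr}}.

```python
import re

# Hypothetical rows standing in for a DataFrame with three string columns.
rows = [
    {"text": "apple apple", "pattern": "apple", "replacement": "orange"},
    {"text": "a1b2c3", "pattern": r"\d", "replacement": "#"},
]

def regexp_replace_rowwise(row):
    # Sketch of the desired column-argument semantics: pattern and
    # replacement are read per row rather than fixed for the whole column.
    return re.sub(row["pattern"], row["replacement"], row["text"])

results = [regexp_replace_rowwise(r) for r in rows]
print(results)  # ['orange orange', 'a#b#c#']
```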



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
