[ https://issues.apache.org/jira/browse/SPARK-17728?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15547358#comment-15547358 ]

Herman van Hovell commented on SPARK-17728:
-------------------------------------------

The second one does not trigger the behavior because it is turned into a 
LocalRelation; LocalRelations are evaluated during optimization, before we 
collapse projections.
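
"The second one" presumably refers to a DataFrame built from a local collection; a sketch of that case (the trivial fUdf definition here is an assumption) shows the projection being folded away before execution:
{noformat}
import spark.implicits._
import org.apache.spark.sql.functions.udf

val fUdf = udf { id: Long => id * 2 }  // stand-in for the expensive UDF

// A Seq-based DataFrame is planned as a LocalRelation; the
// ConvertToLocalRelation rule evaluates the projection over it during
// optimization, so the optimized plan is already a plain LocalRelation.
Seq(1L, 2L, 3L).toDF("id")
  .withColumn("expensive_udf_result", fUdf($"id"))
  .explain(true)
{noformat}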

The following in-memory DataFrame should trigger the same behavior:
{noformat}
spark.range(1, 10)
  .withColumn("expensive_udf_result", fUdf($"id"))
  .withColumn("b", $"expensive_udf_result" + 100)
{noformat}
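
One way to confirm the duplicated execution (a sketch; this fUdf definition is an assumption, instrumented with a LongAccumulator) is to count invocations:
{noformat}
import spark.implicits._
import org.apache.spark.sql.functions.udf

val calls = spark.sparkContext.longAccumulator("fUdf calls")
val fUdf = udf { id: Long => calls.add(1); id * 2 }

spark.range(1, 10)
  .withColumn("expensive_udf_result", fUdf($"id"))
  .withColumn("b", $"expensive_udf_result" + 100)
  .collect()

// 9 rows, but the counter should read 18: after the projections are
// collapsed, the UDF expression is duplicated and runs twice per row.
println(calls.value)
{noformat}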

> UDFs are run too many times
> ---------------------------
>
>                 Key: SPARK-17728
>                 URL: https://issues.apache.org/jira/browse/SPARK-17728
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.0
>         Environment: Databricks Cloud / Spark 2.0.0
>            Reporter: Jacob Eisinger
>            Priority: Minor
>         Attachments: over_optimized_udf.html
>
>
> h3. Background
> We have longer-running processes that might run analytics or contact external 
> services from UDFs. The response might not be just a single field, but a 
> structure of information. When breaking out this information into columns, it 
> is critical that the query is optimized correctly.
> h3. Steps to Reproduce
> # Create some sample data.
> # Create a UDF that returns multiple attributes.
> # Run the UDF over the data.
> # Create new columns from the multiple attributes (see the sketch after this list).
> # Observe run time.
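>
> A minimal sketch of these steps (the tuple-returning UDF, the sleep, and the column names are illustrative, not from the attachment):
> {noformat}
> import spark.implicits._
> import org.apache.spark.sql.functions.udf
>
> // 1. Sample data.
> val data = spark.range(1, 10)
>
> // 2. A UDF that returns multiple attributes (a tuple becomes a struct).
> val fUdf = udf { id: Long =>
>   Thread.sleep(10)  // stand-in for an expensive call
>   (id * 2, id * id)
> }
>
> // 3. Run the UDF over the data.
> val withAttrs = data.withColumn("attrs", fUdf($"id"))
>
> // 4. Create new columns from the multiple attributes.
> val result = withAttrs
>   .withColumn("doubled", $"attrs._1")
>   .withColumn("squared", $"attrs._2")
>
> // 5. Observe run time: the sleep is paid once per extracted column.
> result.collect()
> {noformat}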
> h3. Actual Results
> The UDF is executed *multiple times* _per row._
> h3. Expected Results
> The UDF should only be executed *once* _per row._
> h3. Workaround
> Cache the Dataset after UDF execution.
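>
> Continuing the sketch above:
> {noformat}
> // Caching materializes the UDF output once; the projections that
> // extract the attributes then read the cached column instead of
> // re-evaluating the UDF.
> val cached = data.withColumn("attrs", fUdf($"id")).cache()
> val result = cached
>   .withColumn("doubled", $"attrs._1")
>   .withColumn("squared", $"attrs._2")
> {noformat}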
> h3. Details
> For code and more details, see [^over_optimized_udf.html]


