[ 
https://issues.apache.org/jira/browse/SPARK-35695?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tanel Kiis updated SPARK-35695:
-------------------------------
    Description: 
This example properly fires the observed-metrics event:
{code}
import org.apache.spark.sql.functions.avg
import spark.implicits._  // for the $"col" syntax

spark.range(100)
  .observe(
    name = "other_event",
    avg($"id").cast("int").as("avg_val"))
  .collect()
{code}

But when I add a persist() before the action, no event arrives at the listener. I am not sure whether the event is never fired or is fired but never delivered:
{code}
import org.apache.spark.sql.functions.avg
import spark.implicits._  // for the $"col" syntax

spark.range(100)
  .observe(
    name = "my_event",
    avg($"id").cast("int").as("avg_val"))
  .persist()
  .collect()
{code}
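For reference, a minimal sketch of how the metrics are consumed (this listener is not part of the original report; it assumes a local SparkSession named {{spark}}). Observed metrics are delivered through {{QueryExecutionListener.onSuccess}} via {{QueryExecution.observedMetrics}}, which maps each observation name to a Row of its aggregate values:

{code}
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

val listener = new QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit = {
    // observedMetrics maps each observe() name to a Row of its aggregates.
    // With the persist() in place, "my_event" never appears here.
    qe.observedMetrics.get("my_event").foreach(row => println(s"my_event: $row"))
  }
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = ()
}

spark.listenerManager.register(listener)
{code}

With the first (unpersisted) query this prints the averaged value; with the persisted variant nothing is printed for the observation name.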

  was:
This example properly fires the event
{code}
spark.range(100)
  .observe(
    name = "other_event",
    avg($"id").cast("int").as("avg_val"))
  .collect()
{code}

But when I add persist, then no event is fired or seen (not sure which):
{code}
spark.range(100)
  .observe(
    name = "other_event",
    avg($"id").cast("int").as("avg_val"))
  .persist()
  .collect()
{code}


> QueryExecutionListener does not see any observed metrics fired before 
> persist/cache
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-35695
>                 URL: https://issues.apache.org/jira/browse/SPARK-35695
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Tanel Kiis
>            Priority: Major
>
> This example properly fires the event
> {code}
> spark.range(100)
>   .observe(
>     name = "other_event",
>     avg($"id").cast("int").as("avg_val"))
>   .collect()
> {code}
> But when I add persist, then no event is fired or seen (not sure which):
> {code}
> spark.range(100)
>   .observe(
>     name = "my_event",
>     avg($"id").cast("int").as("avg_val"))
>   .persist()
>   .collect()
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
