[jira] [Assigned] (SPARK-37578) DSV2 is not updating Output Metrics
[ https://issues.apache.org/jira/browse/SPARK-37578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

L. C. Hsieh reassigned SPARK-37578:
-----------------------------------
    Assignee: L. C. Hsieh

> DSV2 is not updating Output Metrics
> -----------------------------------
>
>                 Key: SPARK-37578
>                 URL: https://issues.apache.org/jira/browse/SPARK-37578
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.3, 3.1.2
>            Reporter: Sandeep Katta
>            Assignee: L. C. Hsieh
>            Priority: Major
>
> Repro code:
> ./bin/spark-shell --master local --jars /Users/jars/iceberg-spark3-runtime-0.12.1.jar
>
> {code:java}
> import scala.collection.mutable
> import org.apache.spark.scheduler._
>
> val bytesWritten = new mutable.ArrayBuffer[Long]()
> val recordsWritten = new mutable.ArrayBuffer[Long]()
> val bytesWrittenListener = new SparkListener() {
>   override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
>     bytesWritten += taskEnd.taskMetrics.outputMetrics.bytesWritten
>     recordsWritten += taskEnd.taskMetrics.outputMetrics.recordsWritten
>   }
> }
> spark.sparkContext.addSparkListener(bytesWrittenListener)
> try {
>   val df = spark.range(1000).toDF("id")
>   df.write.format("iceberg").save("Users/data/dsv2_test")
>
>   assert(bytesWritten.sum > 0)
>   assert(recordsWritten.sum > 0)
> } finally {
>   spark.sparkContext.removeSparkListener(bytesWrittenListener)
> }
> {code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
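The repro above hinges on one pattern: a listener collects each task's output metrics at task end, and the driver asserts that their sums are positive after a write. The sketch below models that pattern without a Spark cluster; `OutputMetrics`, `TaskEnd`, and `MetricsListener` are hypothetical stand-ins for Spark's `SparkListenerTaskEnd` / `SparkListener`, used only to illustrate the check that the reported DSV2 bug breaks.

```scala
import scala.collection.mutable

// Hypothetical stand-ins for Spark's task-end event and its output metrics.
case class OutputMetrics(bytesWritten: Long, recordsWritten: Long)
case class TaskEnd(outputMetrics: OutputMetrics)

// Accumulates per-task output metrics, mirroring the repro's SparkListener.
class MetricsListener {
  val bytesWritten = new mutable.ArrayBuffer[Long]()
  val recordsWritten = new mutable.ArrayBuffer[Long]()

  def onTaskEnd(taskEnd: TaskEnd): Unit = {
    bytesWritten += taskEnd.outputMetrics.bytesWritten
    recordsWritten += taskEnd.outputMetrics.recordsWritten
  }
}

object Demo {
  def main(args: Array[String]): Unit = {
    val listener = new MetricsListener
    // Simulate two completed write tasks, as a healthy write path would report.
    listener.onTaskEnd(TaskEnd(OutputMetrics(512L, 100L)))
    listener.onTaskEnd(TaskEnd(OutputMetrics(256L, 50L)))
    // The repro's assertions: under the reported bug, DSV2 tasks report
    // (0, 0) output metrics and both of these fail.
    assert(listener.bytesWritten.sum > 0)
    assert(listener.recordsWritten.sum > 0)
  }
}
```

In the real repro the failing assertions mean `taskEnd.taskMetrics.outputMetrics` stays at zero for DSV2 writes, even though the Iceberg write itself succeeds.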
[jira] [Assigned] (SPARK-37578) DSV2 is not updating Output Metrics
[ https://issues.apache.org/jira/browse/SPARK-37578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-37578:
------------------------------------
    Assignee: Apache Spark
[jira] [Assigned] (SPARK-37578) DSV2 is not updating Output Metrics
[ https://issues.apache.org/jira/browse/SPARK-37578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-37578:
------------------------------------
    Assignee: (was: Apache Spark)