My Spark version is 3.2.2.
I want to observe the row count before writing to JDBC, using `observe`:
ds.observe("stepName", F.count(F.lit(1)).as("_rc"))
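For clarity, this is the same call with the import I am using; `observed` is just my name for the returned Dataset, which is the one that gets written out later:

```scala
import org.apache.spark.sql.{functions => F}

// observe returns a new Dataset; the subsequent write goes through `observed`
val observed = ds.observe("stepName", F.count(F.lit(1)).as("_rc"))
```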
I also add a QueryExecutionListener to my Spark session; inside onSuccess I read the observed metrics, roughly:

val metrics = qe.observedMetrics
logInfo(s"Function $funcName success. Try send metrics size: ${metrics.size}")
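For completeness, the listener is wired up roughly like this (a sketch; the real code logs through `logInfo`, here replaced with `println` to keep it self-contained):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

val spark = SparkSession.builder().getOrCreate()

spark.listenerManager.register(new QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit = {
    // observedMetrics should contain the "_rc" row count from observe()
    val metrics = qe.observedMetrics
    println(s"Function $funcName success. Try send metrics size: ${metrics.size}")
  }
  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = ()
})
```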
When I write data to JDBC, the log prints:
> Function command success. Try send metrics size: 0
But when I write data to HDFS, *I can get the row-count metric*.
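The two write paths I am comparing look roughly like this (the JDBC URL, table name, and HDFS path are placeholders, and `observed` is the Dataset from the observe call above):

```scala
// JDBC write: the listener fires, but qe.observedMetrics is empty
observed.write
  .format("jdbc")
  .option("url", "jdbc:postgresql://host:5432/db") // placeholder URL
  .option("dbtable", "target_table")               // placeholder table
  .option("user", "user")                          // placeholder credentials
  .option("password", "password")
  .mode("append")
  .save()

// HDFS write (Parquet here just as an example):
// the listener receives the "_rc" row-count metric as expected
observed.write
  .mode("overwrite")
  .parquet("hdfs:///tmp/observe_test")             // placeholder path
```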
I also tried *org.apache.spark.sql.Observation*. It likewise cannot obtain the observed metrics for the JDBC write, although it can for the HDFS write.
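What I tried with the Observation helper looks roughly like this (again a sketch; the JDBC options are placeholders):

```scala
import org.apache.spark.sql.{functions => F, Observation}

val observation = Observation("stepName")
val observedDs = ds.observe(observation, F.count(F.lit(1)).as("_rc"))

observedDs.write
  .format("jdbc")
  .option("url", "jdbc:postgresql://host:5432/db") // placeholder
  .option("dbtable", "target_table")               // placeholder
  .mode("append")
  .save()

// Observation.get blocks until the observed metrics arrive;
// for the JDBC write I never get them, while the same pattern
// with an HDFS (Parquet) write returns the row count
val metrics = observation.get
val rowCount = metrics("_rc")
```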
Is there a solution?