Hi,
I am doing a POC in which I have implemented a custom SparkListener.
I have overridden methods such as onTaskEnd(taskEnd: SparkListenerTaskEnd),
onStageCompleted(stageCompleted: SparkListenerStageCompleted), etc., from which
I get information such as taskId, recordsWritten, stageId, recordsRead, etc.
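For reference, here is a minimal sketch of the kind of listener I mean (the
class name and the printed format are just illustrative, not my exact code):

import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted, SparkListenerTaskEnd}

// Illustrative listener: logs per-task and per-stage metrics.
class MetricsListener extends SparkListener {

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics may be null if the task failed before reporting metrics.
    Option(taskEnd.taskMetrics).foreach { m =>
      println(s"task=${taskEnd.taskInfo.taskId} stage=${taskEnd.stageId} " +
        s"recordsRead=${m.inputMetrics.recordsRead} " +
        s"recordsWritten=${m.outputMetrics.recordsWritten}")
    }
  }

  override def onStageCompleted(stageCompleted: SparkListenerStageCompleted): Unit = {
    val info = stageCompleted.stageInfo
    println(s"stage=${info.stageId} name=${info.name} numTasks=${info.numTasks}")
  }
}

// Registered on the active SparkContext, e.g.:
// spark.sparkContext.addSparkListener(new MetricsListener)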
However, I am not able to identify which DataFrame a given task is operating on.
For example, I need to identify the DataFrame for which an input file is read,
or the task in which two DataFrames are joined.

Can someone suggest a solution for this use case, i.e. a way to obtain
DataFrame information for the tasks being executed?
Thanks



