Thanks Rahul, but I think you didn't read the question properly. I have one main Spark
job which I name using the approach you described. As part of the main Spark
job I create multiple threads, which essentially become child Spark jobs,
and those jobs have no direct way of being named.
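As a sketch of one way to handle this: Spark's `SparkContext.setJobGroup` (and `setJobDescription`) set thread-local properties, so each thread can label the jobs it submits and those labels appear in the Spark UI without clobbering the other threads' labels. The thread labels and output paths below are hypothetical, and this assumes the standard `SparkContext` API:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object NamedChildJobs {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[4]").setAppName("mainJob"))

    // Each thread tags its own jobs. setJobGroup is thread-local,
    // so the label applies only to jobs submitted from this thread.
    val threads = Seq("xyz", "abc").map { label =>
      new Thread(() => {
        sc.setJobGroup(label, s"$label child job") // shown in the Spark UI
        sc.parallelize(1 to 100)
          .saveAsTextFile(s"/tmp/$label-out")      // hypothetical output path
      })
    }
    threads.foreach(_.start())
    threads.foreach(_.join())
    sc.stop()
  }
}
```

The group id also lets you cancel a whole child job from another thread via `sc.cancelJobGroup(label)`.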
On Jul 27, 2016 11:17,
You can set the name in SparkConf(), or if you are using spark-submit, set the --name
flag:
val sparkconf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("saveFileJob")
val sc = new SparkContext(sparkconf)
or with spark-submit:
./bin/spark-submit --name "FileSaveJob"
As far as I know, the best you can do is refer to the Actions by line number.
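One alternative to relying on line numbers, assuming the standard `SparkContext` API: `sc.setJobDescription` replaces the call-site description shown in the Spark UI for jobs triggered after it on the same thread. The RDD names and paths here are hypothetical:

```scala
// Label each action so the Spark UI shows a name instead of the call site.
sc.setJobDescription("xyz_saveAsTextFile")
rddXyz.saveAsTextFile("/tmp/xyz-out")  // hypothetical RDD and path

sc.setJobDescription("abc_saveAsTextFile")
rddAbc.saveAsTextFile("/tmp/abc-out")
```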
> On Jul 23, 2016, at 8:47 AM, unk1102 wrote:
>
> Hi I have multiple child spark jobs run at a time. Is there any way to name
> these child spark jobs so I can identify slow running ones. For e.
Hi, I have multiple child Spark jobs running at a time. Is there any way to name
these child Spark jobs so I can identify slow-running ones? For e.g.
xyz_saveAsTextFile(), abc_saveAsTextFile(), etc. Please guide. Thanks in
advance.