Re: How to give name to Spark jobs shown in Spark UI

2016-07-27 Thread unk1102
Thanks Rahul, but I think you didn't read the question properly. I have one main
Spark job, which I name using the approach you described. As part of the main
Spark job I create multiple threads, which essentially become child Spark jobs,
and those child jobs have no direct way of being named.
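
For reference, the job group / description properties on SparkContext are
thread-local, so each thread can usually tag the jobs it triggers, and those tags
show up in the Jobs tab of the Spark UI. A rough sketch along those lines (the
object name, labels and output paths below are only placeholders):

import org.apache.spark.{SparkConf, SparkContext}

object NamedChildJobs {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("mainJob").setMaster("local[4]"))

    val threads = Seq("xyz", "abc").map { label =>
      new Thread(new Runnable {
        override def run(): Unit = {
          // setJobGroup is thread-local, so each thread labels only the jobs
          // it submits; the description appears in the Jobs tab of the UI.
          sc.setJobGroup(label, s"$label saveAsTextFile")
          sc.parallelize(1 to 100).saveAsTextFile(s"/tmp/$label-out") // placeholder output path
          sc.clearJobGroup()
        }
      })
    }
    threads.foreach(_.start())
    threads.foreach(_.join())
    sc.stop()
  }
}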

On Jul 27, 2016 11:17, "rahulkumar-aws [via Apache Spark User List]" <
ml-node+s1001560n27414...@n3.nabble.com> wrote:

> You can set the name in SparkConf(), or if you are using spark-submit, set the
> --name flag:
>
> import org.apache.spark.{SparkConf, SparkContext}
>
> val sparkconf = new SparkConf()
>   .setMaster("local[4]")
>   .setAppName("saveFileJob")
> val sc = new SparkContext(sparkconf)
>
> or with spark-submit:
>
> ./bin/spark-submit --name "FileSaveJob" --master local[4] fileSaver.jar
>
>
>
>
> On Mon, Jul 25, 2016 at 9:46 PM, neil90 [via Apache Spark User List] <[hidden
> email] > wrote:
>
>> As far as I know you can give a name to the SparkContext. I recommend
>> using a cluster monitoring tool like Ganglia to determine where it's slow in
>> your Spark jobs.
>>
>
> Software Developer Sigmoid (SigmoidAnalytics), India





Re: How to give name to Spark jobs shown in Spark UI

2016-07-26 Thread rahulkumar-aws
You can set the name in SparkConf(), or if you are using spark-submit, set the
--name flag:

import org.apache.spark.{SparkConf, SparkContext}

val sparkconf = new SparkConf()
  .setMaster("local[4]")
  .setAppName("saveFileJob")
val sc = new SparkContext(sparkconf)

or with spark-submit:

./bin/spark-submit --name "FileSaveJob" --master local[4] fileSaver.jar




On Mon, Jul 25, 2016 at 9:46 PM, neil90 [via Apache Spark User List] <
ml-node+s1001560n27406...@n3.nabble.com> wrote:

> As far as I know you can give a name to the SparkContext. I recommend
> using a cluster monitoring tool like Ganglia to determine where it's slow in
> your Spark jobs.
>




-
Software Developer
Sigmoid (SigmoidAnalytics), India


Re: How to give name to Spark jobs shown in Spark UI

2016-07-23 Thread Andrew Ehrlich
As far as I know, the best you can do is refer to the Actions by line number.
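
That said, one thing that may also help: the thread-local job description can
usually be set before the action runs, and it replaces the call-site label shown
in the Jobs tab. A minimal sketch from spark-shell (where sc already exists; the
label and output path are just placeholders):

sc.setJobDescription("abc_saveAsTextFile")             // shown as the job's description in the UI
sc.parallelize(1 to 10).saveAsTextFile("/tmp/abc-out") // placeholder output path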

> On Jul 23, 2016, at 8:47 AM, unk1102 wrote:
> 
> Hi, I have multiple child Spark jobs running at a time. Is there any way to
> name these child Spark jobs so I can identify the slow-running ones, e.g.
> xyz_saveAsTextFile(), abc_saveAsTextFile()? Please guide. Thanks in
> advance.





How to give name to Spark jobs shown in Spark UI

2016-07-23 Thread unk1102
Hi, I have multiple child Spark jobs running at a time. Is there any way to
name these child Spark jobs so I can identify the slow-running ones, e.g.
xyz_saveAsTextFile(), abc_saveAsTextFile()? Please guide. Thanks in
advance.



