How to give name to Spark jobs shown in Spark UI
Hi, I have multiple child Spark jobs running at the same time. Is there any way to name these child jobs so that I can identify the slow-running ones, e.g. xyz_saveAsTextFile(), abc_saveAsTextFile()? Please guide. Thanks in advance.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-give-name-to-Spark-jobs-shown-in-Spark-UI-tp27400.html
Re: How to give name to Spark jobs shown in Spark UI
As far as I know, the best you can do is refer to the actions by line number.
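As a complement to reading line numbers: SparkContext also exposes setCallSite(), which overrides the call-site string the UI displays for jobs triggered afterwards. A minimal sketch (the labels and output paths below are illustrative, not from this thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object CallSiteDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[4]").setAppName("callSiteDemo"))

    // Jobs triggered after this call show "xyz_saveAsTextFile" as their
    // call site in the Spark UI instead of the default file:line reference.
    sc.setCallSite("xyz_saveAsTextFile")
    sc.parallelize(1 to 100).saveAsTextFile("/tmp/xyz_out")

    sc.setCallSite("abc_saveAsTextFile")
    sc.parallelize(1 to 100).saveAsTextFile("/tmp/abc_out")

    sc.clearCallSite()  // restore the default file:line labels
    sc.stop()
  }
}
```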
Re: How to give name to Spark jobs shown in Spark UI
You can set the name in SparkConf(), or if you are using spark-submit, set the --name flag:

    val sparkConf = new SparkConf()
      .setMaster("local[4]")
      .setAppName("saveFileJob")
    val sc = new SparkContext(sparkConf)

or with spark-submit:

    ./bin/spark-submit --name "FileSaveJob" --master local[4] fileSaver.jar

On Mon, Jul 25, 2016 at 9:46 PM, neil90 wrote:
> As far as I know you can give a name to the SparkContext. I recommend
> using a cluster monitoring tool like Ganglia to determine where it's
> slow in your Spark jobs.

Software Developer, Sigmoid (SigmoidAnalytics), India
Re: How to give name to Spark jobs shown in Spark UI
Thanks Rahul, but I think you didn't read the question properly. I have one main Spark job, which I name using the approach you described. As part of the main job I create multiple threads, which essentially become child Spark jobs, and those child jobs have no direct way of being named.
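For child jobs launched from separate threads, one approach (a sketch under the thread's scenario, not an answer given in this thread) is to set a job group or description in each thread before triggering the action. These values are stored as thread-local properties on the SparkContext, so each thread's jobs get their own label in the UI:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object ChildJobNames {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[4]").setAppName("mainJob"))

    // setJobGroup/setJobDescription use thread-local properties, so each
    // worker thread can tag its own jobs without affecting the others.
    val threads = Seq("xyz", "abc").map { name =>
      new Thread(new Runnable {
        def run(): Unit = {
          sc.setJobGroup(name, s"${name}_saveAsTextFile")
          sc.parallelize(1 to 100).saveAsTextFile(s"/tmp/${name}_out")
        }
      })
    }
    threads.foreach(_.start())
    threads.foreach(_.join())
    sc.stop()
  }
}
```

The group ID also lets you cancel a slow child job selectively via sc.cancelJobGroup(name).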